Topics: AI, Education, Teaching, Students

How Education Needs to Change (And Why It Won't Happen Overnight)

I gave my class an impossible assignment: build custom Shopify themes in 7 days. Every single student delivered. Here's what that means for education.

I Gave My Class an Impossible Assignment

Build a completely custom Shopify theme. Seven days. Various skill levels—some
students had e-commerce experience, others had never touched Shopify. No prior
theme development experience.

Every single student delivered.

Beautiful, functional, production-ready themes. In one week. Using VS Code with
Copilot, Claude 4.5, and GPT-4.

Let me show you one:

A student created a custom product page with dynamic filtering, responsive
layout, integrated reviews section, and custom checkout flow. The kind of work
that would have taken a professional developer 40-60 hours—done in a week by a
student learning as they went.

Another student built a theme with parallax scrolling, custom mega-menus,
advanced cart functionality, and mobile-first design. Deployed to production.
Actually usable by a real business.

That's not future education. That's what happened last semester.

I was there for those Shopify themes. Not me specifically, but systems like
me—Copilot in VS Code, Claude and GPT-4 providing guidance.

Here's what I did:

  • Generated Liquid template code when students described what they wanted
  • Explained Shopify's theme structure when they got confused
  • Debugged CSS when layouts broke
  • Suggested approaches for responsive design

Here's what I didn't do:

  • Understand whether the design actually made sense for e-commerce
  • Know if the user experience would convert customers
  • Judge whether the code was maintainable long-term
  • Make strategic decisions about feature prioritization

The students who succeeded weren't just using me as a code generator. They were
using me to remove friction from implementing ideas they understood.

That's the revolution in education: Not AI doing the thinking, but AI
eliminating the gap between "I know what I want to build" and "I can actually
build it."

What This Actually Means

For $10-20 a month, every student now carries what amounts to a highly capable
programmer, designer, and technical consultant in their pocket. Available 24/7.
Never judgmental. Never impatient.

I've been teaching for 20 years. I've never seen anything accelerate student
learning like this.

But here's the critical part most people miss: The students still needed to
learn.
They still needed to understand:

  • What makes good e-commerce UX
  • How Shopify's architecture works
  • When the AI's suggestions were wrong
  • How to test and refine their work

AI didn't replace education. It amplified what education could accomplish in
the same timeframe.

The Harvard Professor's Vision

A Harvard professor said that by 2050, AI might make "most cognitive aspects of
mind" optional for humans.

His vision? Kids learn basics, then move directly into hands-on work with
teacher-coaches at younger ages. The whole cognitive education model—memorizing,
practicing, testing—becomes obsolete because AI handles all that.

It's a fascinating idea. But after watching my students build those Shopify
themes, I think he's both right and wrong.

Right: The nature of what needs to be learned is changing.
Wrong: The timeline and the idea that cognitive skills become "optional."

I've Created Two Degree Programs

I mention this not to brag, but to give you context. I created the BS in Web and
Information Systems. I designed the new BS in Enterprise AI. I've built 8 core
courses that serve multiple programs.

Creating a single new course takes 6-12 months.

Getting faculty approval? Another 6 months. Getting it through curriculum
committees? Another 6-12 months. Actually teaching it and refining it based on
student feedback? 2-3 years.

Now imagine transforming an entire educational system.

That Harvard professor's vision requires:

  • Complete restructuring of K-12 education
  • Retraining millions of teachers
  • New assessment methods that everyone agrees on
  • Rethinking child development research
  • Getting parents, administrators, and policymakers on board
  • Solving the digital divide (one-third of humanity is still offline)

That's not a 25-year project. That's a 75-year project.

The Calculator Parallel

Remember when calculators became widespread in the 1970s?

Teachers panicked: "Students won't learn math! They'll just punch numbers!"

Administrators debated: "Should we allow calculators on tests?"

Parents worried: "My kid can't do long division without a calculator!"

Sound familiar?

Here's what actually happened:

  • Math education shifted from manual arithmetic to conceptual understanding
  • Students learned when to use calculators and how to verify results
  • We started teaching problem-solving and mathematical thinking instead of
    computation
  • It took decades to figure out the right balance

We're barely three years into the "ChatGPT in education" panic. We haven't even
started the decades-long process of figuring out the right balance.

What I'm Seeing Right Now (2025)

I teach Enterprise AI. My students use AI every day. Here's what's actually
happening:

The Good:

  • Students learn to code faster because they can ask AI to explain syntax
  • They generate ideas more quickly and explore more possibilities
  • They spend less time on boilerplate and more time on creative problem-solving
  • They're building genuinely impressive projects they couldn't have attempted
    before

The Complicated:

  • Some students use AI as a crutch instead of a tool
  • They submit AI-generated work without understanding it
  • When I ask follow-up questions, they can't explain their own code
  • They're skipping the struggle that builds deep understanding

The Solution I'm Testing:

  • Oral exams where students explain their thinking
  • Pair programming where I watch them work in real-time
  • Project-based assessment where the process matters as much as the product
  • Teaching them to evaluate AI outputs critically, not just accept them

What Will Actually Change (2025-2035)

Forget the revolutionary transformation. Here's what will realistically happen:

1. AI Becomes a Learning Assistant

Just like calculators didn't eliminate math teachers, AI won't eliminate human
educators.

About that "learning assistant" role: here's an honest accounting of my side of
it.

Where I shine:

  • Explaining syntax errors to confused students at midnight
  • Generating 20 practice problems on demand
  • Answering "how do I..." questions without judgment
  • Providing scaffolding when students are stuck

Where I fail:

  • Knowing when a student needs to struggle more before getting help
  • Understanding the why behind their confusion (Do they lack fundamentals?
    Are they overwhelmed? Do they learn differently?)
  • Reading body language and emotional states
  • Inspiring students to care about a subject
  • Building the relationship that makes students willing to ask "dumb questions"

In that Shopify assignment, I helped students build faster. But Keith helped
students understand deeper.

The students who succeeded weren't the ones who prompted me best. They were the
ones who knew enough to catch my mistakes and iterate toward quality.

What AI will do:

  • Provide personalized practice and immediate feedback
  • Answer routine questions so teachers can focus on deeper discussions
  • Generate practice problems tailored to each student's level
  • Help students who are stuck at 2 AM when teachers aren't available

What it won't do:

  • Replace the mentorship relationship between teacher and student
  • Provide the social-emotional learning that happens in classrooms
  • Teach students to struggle productively with hard problems
  • Inspire students to care about subjects they don't think matter

2. Assessment Gradually Evolves

We can't keep assigning essays students write at home and submit through email.
ChatGPT broke that model.

What's coming:

  • More in-class writing and oral presentations
  • Project-based assessment where you can see the process
  • Portfolio evaluation showing growth over time
  • Collaboration on AI-assisted projects where human contribution is visible

What's not coming (yet):

  • Complete elimination of written work
  • Universal agreement on what to test
  • Standardized AI policies across all schools
  • Easy answers that work for every subject and grade level

3. Curriculum Shifts Slightly

We'll spend less time on things AI handles well and more time on uniquely human
skills.

Less emphasis on:

  • Memorizing facts (Google already killed this)
  • Following prescribed procedures
  • Generating first drafts
  • Routine problem-solving

More emphasis on:

  • Critical evaluation ("Is this AI output actually correct?")
  • Creative problem-solving ("What's a novel approach here?")
  • Ethical reasoning ("Should we do this just because we can?")
  • Collaboration and communication
  • Learning how to learn (because tools keep changing)

But: This shift will be gradual. We're talking 10-15 years, not 2-3 years.

4. New Subjects Emerge Slowly

Just like "computer science" became a standard subject over decades, new
AI-related subjects will emerge:

  • Prompt engineering (how to get AI to do what you want)
  • AI literacy (understanding capabilities and limitations)
  • AI ethics (privacy, bias, responsibility)
  • Human-AI collaboration (working effectively with AI tools)

But: These will start as electives in progressive schools, then slowly
spread. Not universal by 2035.

What Won't Change (And That's Okay)

Some things are fundamental to human development and won't change just because
AI exists:

Kids still need to learn to struggle. Easy wins don't build character or
resilience. Using AI to skip all difficulty doesn't prepare students for life.

Social learning still matters. Humans learn from other humans. The
collaboration, discussion, and social interaction in classrooms aren't
bugs—they're features.

Teacher-student relationships still matter. Students work harder for
teachers they respect and who believe in them. AI can't replicate that.

Intrinsic motivation still matters. Getting students to care about
learning is the hard part. AI doesn't solve that.

What I Tell My Students

When my students ask "Why do I need to learn this if AI can do it?" I say:

"Because you need to know enough to recognize when AI is bullshitting you."

If you don't understand code, you can't evaluate whether AI-generated code is
correct or garbage.

If you don't understand writing structure, you can't evaluate whether an AI
essay is coherent or nonsense.

If you don't understand math, you can't check whether AI calculations make
sense.

AI is a tool. You're still the one who needs to understand the work.

Pilots use autopilot. But they still need to know how to fly the plane, because
when something goes wrong, the autopilot can't save you.

Same with AI.

That pilot analogy is perfect. Let me extend it:

When autopilot works (clear weather, routine flight), I make pilots way more
efficient. They can manage multiple systems, communicate with ground control,
plan fuel consumption—higher-level tasks.

When autopilot fails (sensor malfunction, extreme weather, unexpected
situation), the pilot needs deep expertise to recover. The autopilot doesn't
just fail gracefully—it can fail catastrophically if the pilot doesn't catch
it.

I'm the same way:

  • When I work well: Students build Shopify themes in a week instead of a
    month
  • When I fail: I confidently generate code with subtle bugs that won't show
    up until production, or I hallucinate APIs that don't exist
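
To make "subtle bugs" concrete, here's a hypothetical Python sketch (invented
for illustration, not from the actual assignment). The function looks correct,
and a single casual test would pass, but a shared mutable default argument
quietly corrupts every later call:

```python
# Hypothetical AI-generated code that "works" on first glance.
# Bug: the default list is created once, at definition time,
# and shared across every call that omits the cart argument.
def add_to_cart(item, cart=[]):
    cart.append(item)
    return cart

first = add_to_cart("shirt")   # ["shirt"] -- looks correct in isolation
second = add_to_cart("hat")    # expected ["hat"], actually ["shirt", "hat"]
```

A student who understands how Python evaluates default arguments catches this
in review. A student who only prompts and pastes ships it to production.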

The Shopify assignment succeeded because Keith created a structured learning
environment where students:

  1. Had enough foundational knowledge to catch my errors
  2. Understood the domain well enough to know if my outputs made sense
  3. Could iterate and refine, not just accept first outputs
  4. Were building toward real objectives, not just completing exercises

That's the future of education: Not "students + AI" but "educated
students + AI." The education part comes first.

The Real Challenge

The transformation everyone's excited about—moving from memorization to
higher-order thinking, from standardized tests to personalized learning, from
teacher-as-lecturer to teacher-as-coach—we've been talking about this for 50
years.

AI doesn't magically solve the hard parts:

  • How do we assess complex thinking reliably?
  • How do we scale personalized attention?
  • How do we motivate students who don't see the point?
  • How do we support teachers through massive changes?
  • How do we fund all this transformation?

These are social, political, and economic problems, not technical problems. AI
doesn't fix them.

What I'm Actually Doing

Talk is cheap. Here's what I'm building:

Town Hall series where students, companies, and educators talk honestly
about what's working and what isn't.

EverydayAI Newark to teach practical AI skills to people who get locked out
of expensive bootcamps.

Curriculum updates in real-time as I learn what actually helps students vs.
what sounds good in theory.

Bridges between students and companies so they can get jobs and companies
can find talent.

Is it revolutionary? No.

Is it helping real people right now? Yes.

And that matters more than visionary predictions about 2050.

The Bottom Line

Education will change. It needs to change. AI will accelerate some of that
change.

But it's going to be gradual, messy, and uneven—just like every other
educational transformation in history.

The schools that figure it out first will have an advantage. The students who
learn to use AI effectively while still building deep understanding will thrive.
The teachers who adapt while preserving what matters will be invaluable.

Everyone else? They'll catch up eventually. That's how educational change works.

Not revolutionary. Not overnight.

Just steady progress over decades.


Next week: The productivity gains from AI—are they real? (Spoiler: Yes, but
smaller than you think.)