50 First Dates Explains Everything That’s Wrong With AI

I found the biggest problem with AI today in the strangest place: while watching a romantic comedy from 2004 called 50 First Dates.

This video is not a movie review.

It’s not an explainer of how AI works.

And it’s not another take on prompts, tools, or trends.

It’s a fundamental reframing of how intelligence itself is being misunderstood in AI.

The clip is just four minutes long.

Watch it first. Then read the article.

If it doesn’t change how you see artificial intelligence, you don’t yet understand what AI actually is.

THE INTELLIGENCE THAT DOESN’T RESET

We call AI intelligent because it performs well in the moment.

Not because it isn’t powerful.

Not because it isn’t impressive.

It is.

But somewhere along the way, two different ideas quietly collapsed into one.

Power and intelligence.

And once they blurred, something essential slipped out of view.

Because the way we’ve been trying to build intelligence has mostly meant adding more.

More data.

More compute.

The assumption is simple and widespread:

If you scale far enough, intelligence will eventually emerge.

For a while, that feels true.

The systems get better.

Faster.

More convincing.

They pass exams.

They outperform humans on narrow tasks.

They explain almost anything on demand.

So we call that intelligence.

By those measures, today’s systems are extraordinary.

But those measures all share the same blind spot.

They only tell us how a system performs in a moment.

Not how it behaves over time.

One prompt.

One task.

And then it’s over.

New session.

Clean slate.

Nothing carries forward.

Not the mistake.

Not the correction.

Not the contradiction.
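The reset described above can be made concrete. A minimal sketch in plain Python (no real chat API assumed, and the canned reply is purely illustrative): a "session" is just a fresh list of messages, and nothing from an earlier session is visible unless you pass it in yourself.

```python
def new_session():
    # A "session" is nothing more than a fresh, empty message list.
    return []

def ask(session, prompt):
    # The system only ever sees what is inside `session` right now.
    session.append({"role": "user", "content": prompt})
    reply = f"(answer based on {len(session)} visible messages)"
    session.append({"role": "assistant", "content": reply})
    return reply

s1 = new_session()
ask(s1, "My name is Lucy.")
ask(s1, "What is my name?")   # sees the earlier exchange

s2 = new_session()            # clean slate: s1 no longer exists here
ask(s2, "What is my name?")   # sees nothing that came before
```

The point of the sketch is what it lacks: there is no line of code that carries s1 into s2. Continuity only happens if the caller rebuilds it by hand, every time.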

Once you notice that, it’s hard to unsee.

Imagine someone like that.

Thoughtful.

Articulate.

Capable.

Until the next day, when they confidently tell you the opposite of what they said yesterday.

Not because they learned something new.

Because yesterday no longer exists.

At that point, you don’t question their intelligence.

You question the situation you’ve put yourself in.

If you’ve seen 50 First Dates, you already understand the shape of the problem.

Lucy is smart.

Engaging.

Fully capable.

But every morning, it all resets.

She can read the notes.

She can watch the tapes.

But notes aren’t memory.

Memory isn’t information you consult.

It’s what changes you before you get the chance to forget.

Now imagine relying on a system that never gets that kind of pushback.

A system that can explain consequences perfectly, and still walk straight back into them tomorrow without friction.

A doctor.

A lawyer.

A policymaker.

Not careless.

Not malicious.

Just unconstrained.

And this is the turn most people miss:

A system can model the world with stunning accuracy and still fail to live in a world that persists.

That’s not a memory problem.

It’s a continuity problem.

A child can’t define gravity.

They don’t know the equations.

They can’t explain the force.

But they don’t need to.

They learn by being subject to it.

By acting.

By being wrong.

By not getting to erase the result.

They don’t predict the world.

They carry it forward.

Today’s systems predict.

They perform.

They explain.

But nothing binds yesterday to today.

Nothing accumulates.

Nothing resists.

Consistency is optional.

That’s why they feel so capable, and still unfinished, at the same time.

The industry keeps optimizing what’s easiest to see.

Fluency.

Speed.

As if continuity will simply appear once the answers are good enough.

But continuity doesn’t emerge by accident.

It shows up when something is forced to stay consistent with itself.

When actions leave residue.

When the past can’t be deleted without cost.

That’s when the question changes.

It’s no longer:

How good was the answer?

It becomes:

Did today make it harder to be wrong the same way tomorrow?

Because intelligence doesn’t live in a single response.

It lives in what narrows over time.

In what becomes constrained by its own history.

If nothing sticks, it isn’t intelligence under real conditions.

It’s rehearsal.

And the next real shift won’t come from scale alone.

It will come from systems that can’t pretend yesterday didn’t happen.

Systems shaped by what they’ve already done.

Not memory as a database.

Not learning as a patch.

But coherence that survives contact with time.

Intelligence stabilizes when yesterday cannot be erased.

This video wasn’t created for clicks.

It was created as my official entry to the 2026 Webby Awards, taking place in May 2026.

The Webby Awards recognize work that doesn't just explain technology but reshapes how people understand it. That's the standard this piece was built to meet.

If this reframed how you think about AI, about intelligence, continuity, and what actually matters, you can view the full Webby Awards entry here:

👉 http://ernestoverdugo.com/webby

Support isn’t about votes.

It’s about backing ideas that move the conversation forward.

I appreciate yours.