Screen Zombies at Lunch, AI at the Gate

Apr 18, 2025

There he was—a preteen kid, hunched over a laptop in the middle of a family lunch, hammering away at a low-rent Mario knockoff like his life depended on it. He didn’t touch his food. Didn’t speak. Didn’t look up. When the battery finally gave out, his mother passed him her phone like a communion wafer. Bam—back in the digital trance.

No eye contact. No conversation. Just screen glow and silence.

This didn’t start yesterday. We’ve been plugging in and tuning out for more than a decade. That boy wasn’t just reacting on instinct—he was learning. From the adult beside him. From the world around him. From the culture that taught his mother that a screen was safer, easier, quieter than engaging.

It’s the new pacifier.
Only it doesn’t just quiet the child. It rewires them.

What used to be a moment of learning or connection has become a moment of surrender—to the machine, the algorithm, the dopamine drip of digital distraction. And the scariest part? Most parents don’t even see it happening. They’re doing the same thing.

We’ve created a loop. The child learns from the parent. The parent is overwhelmed, under-supported, too often exhausted from a system that no longer treats parenting as a communal responsibility. Toss a tablet in their hands. Problem solved. Until the real problems show up years later—in classrooms, in counselors’ offices, in remedial classes at colleges and trade schools, and eventually in voting booths.

Now zoom out. The U.S. Department of Education just released a report showing that school test scores—math, reading, science—have been on a steady decline for years. It’s not a blip. It’s a dive. And while the house is on fire, we’re tossing out the smoke alarms: The federal office that tracked and analyzed those very scores was quietly eliminated by the administration. Shut down. Lights out.

So here we are, raising screen-trained kids while flying blind. It’s like being on a sinking ship and laying off the crew that monitors the lifeboats.

Education isn’t just struggling—it’s being disassembled in broad daylight. Books are banned. Teachers are blamed. Budgets are slashed. And any attempt at teaching critical thinking is labeled as political indoctrination.

Meanwhile, in sleek conference rooms far from school hallways and PTA meetings, former Google CEO Eric Schmidt is sounding the alarm. He says that in just six years, artificial intelligence will outpace all of humanity’s intelligence—combined. That’s not science fiction. That’s the assessment of someone who helped build the digital world we now live in.

And if that sounds like a threat—it should.

This isn’t just a matter of machines being better at math. It’s about systems being able to write laws, diagnose illnesses, outmaneuver economies, and potentially destabilize societies—all faster than we can comprehend. While we’re fighting about drag queens at story hour, AI is quietly mapping out how to predict elections and influence public sentiment.

As a college student, I learned FORTRAN, COBOL, and assembly language—the sacred trinity of programming that powered early computing. Back then, programming was a discipline that demanded logic, rigor, and time. Now? AI does the coding. It learns your syntax, predicts your function, corrects your logic—before you’ve had your coffee. It’s not just an assistant. It’s a replacement.
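The discipline that paragraph recalls is easy to illustrate. Here is a minimal sketch—written in Python rather than FORTRAN, purely for readability—of the kind of hand-traced loop a student of that era worked through line by line on paper before ever touching a keyboard:

```python
def factorial(n: int) -> int:
    """Iterative factorial: the sort of exercise a FORTRAN or COBOL
    student once stepped through by hand, tracking each variable."""
    result = 1
    for i in range(2, n + 1):  # multiply result by 2, 3, ..., n
        result *= i
    return result

print(factorial(5))  # 120
```

Today, an AI coding assistant will typically autocomplete a routine like this from the function name alone—which is exactly the shift the paragraph above describes.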

And if you’ve ever seen the movie Colossus: The Forbin Project, or read the D. F. Jones novel Colossus it was based on, you know what I’m talking about. The movie showed us what happens when a supercomputer designed for national security becomes too smart, too fast, and takes control “for our own good.” Back then, it was science fiction.

Now? We’re at the prequel.

So let’s recap:

  • Kids are distracted, not educated.
  • Parents are numbed, not empowered.
  • Teachers are undermined, not supported.
  • Politicians are too busy picking culture war fights to fund the front lines of learning.
  • And AI is evolving faster than the policies needed to control it.

If this all feels eerily familiar, it should. Because Ray Bradbury, George Orwell, and Jones wrote the rough drafts decades ago.

Bradbury’s Fahrenheit 451 wasn’t just about firemen burning books. It was about people choosing distraction over knowledge. His characters were addicted to wall-sized TVs, engaged in shallow talk, and terrified of silence. We’ve replicated that world with phones in our hands, earbuds in our ears, and our attention split into pieces so small, even silence feels threatening.

“You don’t have to burn books to destroy a culture,” Bradbury said. “Just get people to stop reading them.”

And we did. We stopped reading. Stopped discussing. Stopped reflecting. We trained ourselves to skim, scroll, and swipe. And now we’re raising kids who think the “algorithm” is a trusted source, and reading comprehension is a relic of the past.

Orwell’s 1984, on the other hand, showed us a world where truth was bent to serve power. Where language was reduced, facts were rewritten, and reality was dictated from above. He imagined the boot on the neck. But what Orwell couldn’t have seen coming was that we’d invite the boot in—with permission settings and a “like” button.

Our surveillance state isn’t enforced—it’s volunteered for. We give away privacy in exchange for convenience. We outsource memory to Google. And we mistake information overload for actual understanding.

Now here comes Schmidt, warning that the AI arms race will look like the nuclear sprint of the last century. The difference? You can’t see the fallout. It happens behind firewalls and in invisible code, as machine-learning models replace factory workers, journalists, analysts, and maybe one day, policymakers.

And the Chinese? They’re already planning.

They see our test scores. They monitor our politics. They know we’re distracted. While we squabble over whether a book belongs on a school shelf, China is developing its own advanced supercomputing intelligence on quantum infrastructure, and partnering with U.S. defense contractors through loopholes and quiet deals to take advantage of our knowledge.

We’re not falling behind. We’re rolling backward with the parking brake off.

So let’s be brutally honest: authoritarianism doesn’t need a face anymore. It doesn’t come with boots and flags. It arrives through automation, misinformation, and cultural sedation. And when the moment comes that truth no longer matters, we won’t even notice it—because we’ll be watching the next video queued up in our feed.

This isn’t just a tech problem. It’s a cultural rot. We’ve replaced engagement with entertainment, community with consumption, and knowledge with notification.

AI is not the enemy. But if we raise a generation that can’t ask deep questions, can’t discern truth from noise, and can’t imagine a world beyond their screen—then we’ve done the enemy’s work for them.

We’ve made them passive.
We’ve made them predictable.
We’ve made them programmable.

And worst of all, we’ve done it under the guise of parenting, progress, and peace.

We don’t need to burn the schools. We just need to starve them. And we’re well on our way.


The Revolution Won’t Be Televised—Because It’s Already Happening

Gil Scott-Heron warned us: “The revolution will not be televised.” But here’s the 21st-century twist:

The revolution isn’t just not televised—it’s being filtered, monetized, and buried under clickbait.

We’re not sitting around one screen anymore. We’re siloed inside a thousand personalized echo chambers, each curated by algorithms that feed us comfort, confirmation, and distraction. The revolution doesn’t trend—it’s quietly replaced by ads.

What used to be public is now targeted. What used to be debated is now muted. What used to be felt is now scrolled past.

So when the truth changes—or vanishes altogether—it won’t be with a bang. It’ll be with a push notification.

The revolution is already here.
And most of us are too distracted to see it.