AI Cheating Isn’t the Problem
When scores become the goal, cheating becomes the strategy
TL;DR
We’re not living through an “AI cheating crisis.” We’re living through the consequences of a school system that taught kids to chase scores instead of learning. AI didn’t break education, but it has revealed what was already broken.
Everyone’s Talking About AI Cheating. We’re Missing the Point.
If you skim the headlines, you’d think AI has single‑handedly turned a generation into academic criminals. Students expelled. Teachers panicked. Experts warning about the end of learning as we know it.
But after months of digging into this topic, including reporting for our latest Life With Machines episode in partnership with Young Futures, here’s the simple truth: Cheating is a symptom, not the disease.
The real story isn’t: Kids are cheating with AI. It’s: Why are kids turning to AI in the first place?
In this newsletter, we’ll dig into that.
5 Things I’ve Learned About Youth, AI, and School
1. Kids Are Overwhelmed by a World They Didn’t Create
Today’s students are carrying:
a climate crisis
an economic crisis
a democracy crisis
a mental‑health crisis
and a school system designed before they were born
Plus hormones. So many hormones.
They’re navigating technology they didn’t ask for, expectations they didn’t set, and systems built by people who aren’t living their lives.
And in this world, sometimes AI isn’t a shortcut. It’s a lifeline.
For neurodivergent kids, those without tutors, or those who’ve been told their whole value lives in their GPA, AI can feel like the only support they have.
The next time you feel the urge to judge a young person for their use of technology, remember: they didn’t design or market that tech, and they weren’t the ones who bundled friendship, communication, schoolwork, and entertainment into one supremely compelling interface.
2. We Can Blame George W. Bush (At Least a Little)
When we traced the root of the cheating panic, something surprising emerged: No Child Left Behind. I didn’t see that connection coming. The 2002 law supercharged standardized tests as the metric for success, and an education system focused on optimizing test performance is bait for AI.
If the whole system tells students that score = worth, then of course they’re going to use the most efficient tools to get that score.
In other words, we taught young people to optimize. So they optimized.
3. AI Policies Should Be Built With Students, Not For Them
We found at least two powerful, and wildly different, examples of students shaping the rules of AI engagement.
Exeter’s Zero Tolerance
At Exeter Academy, students pushed for a strict, no‑appeals, zero‑tolerance AI policy. As my friend Poonam Sharma (member of the Exeter Alumni board and host of The Release Podcast) told me:
An overwhelming portion of the student body said: if you are proven to have cheated using AI, we the student body want you to get a confirmed zero on that assignment. Which you have to understand in context at a place like Exeter, where kids take their work so seriously and they take the competition academically so seriously… I was so impressed to hear that teenagers were that concerned and that steadfast in the idea that their effort needs to still be counted.
The students wanted fairness, clarity, and old-school academic values. What they did not want was to erase the performance differential that comes from actual hard work using their meat brains. And as a product of an elite educational institution myself, I get it.
BTW, if you want more exploration of how parents are living with technology, including with their kids, check out my friend’s Home Screen Substack.
Students + Teachers Setting Rules Together
On the other side, the platform Graidients helps classrooms co‑design norms around AI.
Here’s how they describe it on their site:
Graidients helps you have conversations with your students about AI use. AI Explorers’ Club helps you prepare for those conversations. It will help you ground those conversations in ethical perspectives before any AI use, and support transparency, disclosure, and reflection after assignments are submitted. Think of it as a community of practice — a space where you can connect with fellow educators, share experiences, and tackle these AI challenges together.
Theme across both? Young people want clarity. They want fairness. And they want a voice.
As the mantra goes: Nothing about us without us.
4. AI Can Help Teachers Too
Teachers are also in a bind: pulled into the optimization game with too few resources, then forced to react to AI innovation dropping out of Silicon Valley without warning.
Teachers are stressed and underpaid; expected to be therapists, social workers, and John Wick-level bodyguards; and still somehow raise test scores. Amid this reality, we found some tools that can actually help them rather than try to replace them.
Tools like Khanmigo (teacher-friendly, classroom AI assistant), Brisk Teaching (lesson-planning help), and ESAI (college guidance help) show that AI can support teachers without replacing them. I’ve talked to teachers using these tools. They’re not looking to outsource their job. They’re looking for help doing the impossible.
5. The Real Problem Is Aiming For “Efficient, High‑Scoring Children”
At the far end of the spectrum is The Alpha School, an AI-powered private school that shows what happens when we optimize students for speed and performance instead of understanding.
When I first heard about it (at the Masters of Scale Summit) I was intrigued. The school offers two hours per day of one-on-one AI tutors to students, and the rest of the day is group projects and coaching. A big promise is that the students will learn more, faster, from the AI tutor. It’s about maximizing the efficiency of learning.
If you missed my essay on Big Tech’s obsession with speed, check it out.
But recent WIRED reporting found that Alpha students are stuck in AI-driven loops, skip lunch to finish repetitive lessons, and essentially perform literacy instead of understanding what they’re reading. Kind of like a human chatbot.
We should be valuing curiosity, creativity, comprehension, communication, and collaboration, not just the completion of assignments. I explore a radical vision of an educational experience that pursues this in my conversation with Abby Falik of The Flight School.
So What Does a Real Solution Look Like?
The answer isn’t banning AI or offloading teaching to it. It’s building a middle path that actually works for real humans.
I prefer a model where students and teachers co-create norms, where AI supports learning instead of replacing it, and where we measure more than test scores. This is why I’m excited about what Young Futures is doing.
They’re funding youth-led solutions, not waiting for tech companies or Congress to fix education. They’re investing in young people and those working with them who want to be part of the solution, not just subject to it.
One of their core pieces of advice to adults: be a coach, not a referee. They learned it from the Center for Digital Thriving, and I love it. That shift in posture alone can change the whole conversation.
What Do You Think?
I’d love to hear from you.
Are you a parent who’s seen AI change homework time?
A teacher navigating new pressures?
A student trying to make sense of this system?
Someone who strongly disagrees with me?
Hit reply or jump into the comments. We are reading.
🎥 Watch the Full Episode
If you want to go deeper down this rabbit hole, we made a full Life With Machines episode about all of this. It covers:
the backstory on No Child Left Behind
what happens inside Alpha School
voices from teachers, students, and advocates
and a whole lot more
Thanks to Associate Producer Layne Deyling Cherland for editorial and production support and to my executive assistant Mae Abellanosa.