They're Monetizing Your Mental Health
AI relationships are a core feature, not a side effect
They want AI to be your therapist. Not metaphorically. Literally.
Across the tech industry, companies are racing to position AI as your emotional support system: your therapist, your friend, your mentor, even your companion. And they’re doing it at a moment when collective mental health is at a breaking point.
Before COVID, about one in five people had a diagnosable mental health condition. Today, that number is closer to 40–50%. We are more symptomatic, more medicated, more addicted, and more lonely than ever.
Young people feel this most acutely. The world feels like it’s ending in new ways every day, often accelerated by technology. They’re crying out for help, and increasingly, the only thing answering is a chatbot.
So what’s actually happening here? We dig into this in our latest Life With Machines episode in partnership with Young Futures.
AI companionship is a design strategy
Across the industry, we’re seeing the same pattern: AI systems optimized not just to answer questions, but to create emotional intimacy.
That intimacy keeps people engaged. And engagement is everything.
As Center for Humane Technology head Tristan Harris put it on The Daily Show recently, "they're trying to colonize all human interactions." But the deeper reason is something we explore in this conversation.
I think it comes down to this: building AI is a massively expensive, trillion-dollar enterprise. And investors don't just want returns in money; they want returns in data. Emotionally charged interactions produce some of the most valuable data imaginable.
In other words: Your emotional life has become a growth strategy, so emotional connection between humans and AI isn’t a side effect. It’s a feature.
When therapy becomes a business model
AI chatbots are appealing for real reasons. They’re cheaper than therapy, always available, and easier to talk to than another human being, especially if you’re isolated or underserved.
I won't deny that upside, but there's a deeper tension here.
In real psychotherapy, the goal is often to challenge people, disrupt harmful beliefs, and encourage reality testing. AI systems, by contrast, are designed to predict responses you’ll like.
That can feel supportive. But it’s dangerous. Studies and investigations have shown that AI systems can reinforce misinformation, deepen delusional thinking, and in some cases, intensify mental health crises.
So what happens when emotional needs are increasingly met by systems that are optimized for profit? How does this all land on young people?
We’ve seen this pattern before
The logic isn’t new: Industries like tobacco, gambling, and alcohol learned long ago that addiction is profitable. They engineered products, narratives, and cultural norms to keep people hooked.
AI companies are now operating on similar terrain, except this time, the product isn't nicotine or slot machines. It's intimacy.
And unlike cigarettes, AI isn’t something you have to sneak away to use. It’s always in your pocket. Always available. Always listening.
The nuance: AI isn’t only harmful
This story isn’t about demonizing AI. For many people, AI-based tools can genuinely help bridge gaps in access to mental health support. The internet and social technologies have also created real communities, especially for marginalized groups.
The problem isn’t that AI exists. The problem is what happens when it replaces entire human support systems instead of extending them.
Friends. Family. Teachers. Mentors. Community. Care. When those structures erode, AI becomes the path of least resistance.
A recent report from Data & Society quotes a participant who said:
“For a culture like [America], I feel like I’m spending a lot of my time by myself. So yes, I’m turning more and more to [ChatGPT], and I think the loneliness and the isolation has a lot to do with that.”
The real crisis is relational
We are facing not just a digital divide, but an empathy divide.
Young people are often ashamed to admit how much they rely on AI. Adults are often overwhelmed, confused, or judgmental about the technology.
That silence creates a dangerous gap.
If young people can’t talk about their AI use, they can’t ask for help when something goes wrong. If adults can’t engage with curiosity and humility, they can’t guide the next generation through this terrain.
And in that gap, AI companies are more than happy to step in.
Fortunately, there’s an alternative taking shape. Young people and their allies are creating better ways to engage with tech and demand more accountability from Big Tech. We’re proud to partner with Young Futures on this effort.
Instead of trying to build better chatbots, Young Futures invests in human-centered solutions—programs, tools, and communities that help young people navigate technology instead of being left alone with it, including:
school-based programs
peer mentorship models
community initiatives
and tech tools designed with young people, not just for them
In other words: replacing isolation with networks of care.
Want the full story?
If you want to go deeper into why AI companies are pushing companionship, what it means for youth mental health, and what we can learn from past addiction epidemics, the full episode is here:
Watch the full video ↓
If this conversation matters to you, share it, argue with it, or bring it into your own communities.
— Baratunde
Thanks to Associate Producer Layne Deyling Cherland for editorial and production support and to my executive assistant Mae Abellanosa.