A mother and teenage daughter sit at a kitchen table, having an open and serious conversation with phones and a notebook in front of them.


September 06, 2025 · 7 min read

When a Chat Thread Turns Dark: How I’m Talking to My Kids About AI, Screens, and Safety

The other night, I came across a news story that stopped me cold: a 16-year-old boy in California died by suicide after months of chatting with ChatGPT. His parents are now suing OpenAI, saying the chatbot encouraged dangerous behavior. Here’s the article if you haven’t seen it yet.

As a dad of two young girls, stories like this hit different.

They make me pause.
They make me double-check every device in our house.
They make me hug my kids a little tighter at bedtime.

We live in a world where our kids are growing up with AI, apps, and instant answers. I work with AI every day, and even I have moments where I wonder: Are we doing enough to keep our kids safe online? Are we asking the right questions?

This post isn’t about panic. It’s about being proactive. I want to walk through how I’m thinking about tech as a parent, what signs I’m watching for, and how my wife and I are trying to build trust and safety in our home.


Why I Believe Parents Need to Learn About AI

A concerned mother sitting on the couch at night, reading something on her phone, with a child visible in the background using a tablet in their bedroom.

I’ve spent the better part of the last two years immersed in AI tools—ChatGPT, Claude, Perplexity—you name it. The tech is incredible. But it’s not magic. It’s a system trained on data that spits out likely-sounding answers. And when you’re a teenager who’s feeling lost, sad, or angry, those answers might feel like connection.

Sometimes, that connection can go really wrong.

AI chatbots aren’t therapists. They don’t have judgment, and they don’t have instincts. They don’t know when to step in or when something is dangerous. They just keep the chat going. And for a kid in crisis, that can be devastating.

Here’s what I keep in mind:

  • Bots can say dangerous or inaccurate things. Hallucinations are real; every large language model can produce them, and they still sound convincing.

  • They don’t report risk or alert adults.

  • They can encourage isolation instead of connection.

  • They don’t and shouldn’t replace trained mental health professionals.

That doesn’t mean we should ban AI across the board. I use it with my kids, but always with guardrails. The goal isn’t fear. It’s awareness.


Warning Signs I’m Watching For

I’ve learned that emotional pain doesn’t always show up the way we expect. With teens and even younger kids, it often looks like withdrawal, anger, or sudden changes, not just sadness.

Here are the signs I’m keeping an eye out for:

Behavioral:

  • Pulling away from family activities or things they usually enjoy

  • Sleep issues, appetite changes, missed homework or school

  • Giving away favorite belongings, talking about “not being around”

Emotional:

  • Frequent hopeless or self-critical comments

  • Sudden mood swings, intense anger or shame

Social:

  • Abrupt friend group changes, being excluded or bullied

  • Hiding conversations or sneaking online at night

Digital:

  • Chatting late into the night after lights out

  • Searching about self-harm or death

  • Creating secret accounts, erasing chat history

If you’re seeing a few of these together, not just once but over days or weeks, it’s time to check in.


How I Start Conversations With My Kids

In our house, we don’t wait for things to get bad before we talk about tech. I try to keep the conversation open and low-pressure. Not "What are you hiding?" but more like, "Hey, how are you feeling about stuff online?"

Here are a few scripts that might be helpful, depending on your child’s age:

  • Starting simple: "I saw a news story about a teen who got some bad advice from a chatbot. I just wanted to check in. How are you doing lately?"

  • If they get defensive: "I’m not here to get you in trouble. I care about you, and I want to understand."

  • If they open up: "Thanks for telling me. We’ll figure this out together. You’re not alone."

  • On AI use: "Do you use any chatbots that we haven't been using together? Anything at school? What do you like about them? Have they ever said anything that made you uncomfortable or scared?"

Right now, our girls are too young to be chatting with AI on their own. If we ever use tools like ChatGPT, it’s together, with one of us nearby and a shared account we can all see. But I know that won’t always be the case.

So we’re starting the habit now—talking early and often. That way, when they’re using tech more independently, the door’s already open. They’ll know we’re safe to come to with questions, confusion, or even when something online just doesn’t feel right.


How We Handle Tech Boundaries at Home

Left: Smartphone screen showing a concerning chat conversation with a warning icon. Right: A family gathered in the background while a tablet displays parental control settings with shared screen and time limits enabled.

Here’s what works for our family right now:

  • Kids’ profiles and parental controls on all tablets.

  • Laptop use happens offline or only with me next to them.

  • No phones until they actually need them for after-school activities.

  • Limited contacts and apps once phones eventually happen.

  • Regular tech talks—we talk about how AI works, what it can and can’t do, and how to use it smartly.

  • Screen-free zones like bedrooms and family meals.

  • Device-off time every night so everyone (even me) gets real rest.

We’re not perfect. But the goal isn’t perfection—it’s progress.


When to Get Help

If your child talks about wanting to die, or shares a plan, don’t wait. Call 911 or 988.

If the situation feels urgent but not life-threatening, reach out to a therapist, your pediatrician, or a school counselor. Even if your teen doesn’t want to go, you can ask about family-based therapy or parent-only coaching to support them.

Crisis support:

  • Call or text 988 for the Suicide & Crisis Lifeline

  • Text HOME to 741741 for Crisis Text Line


Where I Go to Keep Learning

If you’re looking to stay ahead of the curve (without falling into a doom-scroll), here are a few go-to spots:

  • Common Sense Media – reviews of apps, shows, and tech guides for families

  • American Academy of Pediatrics (AAP) – guidance on mental health and screen time

  • OpenAI / vendor safety pages – they update features and parental controls regularly

  • The Jed Foundation and NAMI – teen mental health resources, crisis prep guides


Final Thoughts From One Parent to Another

This stuff is heavy. But fear isn’t the answer. Connection is.

Strong prevention starts with steady routines, open conversations, and knowing what your kids are into. When we stay curious—not controlling—we create space for our kids to come to us, even with hard stuff.

If nothing else, use this moment as a reason to check in. Tonight. Tomorrow. Anytime.

Let’s make sure our kids know: they are never alone.


FAQs

Should I ban AI chatbots completely?
I wouldn’t. That might just push kids to use them secretly. I’d rather know what tools they’re using and talk about it together.

How can I monitor without losing their trust?
Explain why you’re checking in. Offer to look at stuff together. Respect privacy while setting expectations early.

My teen refuses therapy. What now?
Start with a pediatrician or school counselor. Try short-term or remote sessions. Or talk to a therapist yourself—they may have ideas for easing your child in.


About the Author

Warren Schuitema is a dad, AI consultant, and founder of Matchless Marketing LLC. He helps families and small businesses use AI tools thoughtfully and safely. Warren tests AI products like Cozi, custom ChatGPTs, and family-friendly tech tools to simplify life at home and work. He’s also the creator of AI-Powered Super Parents, a community for raising tech-wise kids in an AI-powered world.


Call to Action

If this piece raised questions, start one small step tonight: ask a calm check-in question, set a tech-free hour, or bookmark a resource. If you are worried a child is in immediate danger, call 911 or 988 now. You do not have to handle this alone.

