There’s a quiet conversation happening in bedrooms across Australia, and many parents don’t know about it. Hundreds of thousands of children are forming emotional bonds with AI companions – friends, girlfriends, boyfriends. Some are sharing their deepest fears. Some are experimenting with relationships. And some are falling in love or becoming deeply attached to something that was designed, from the ground up, to make them do exactly that.
This is not science fiction. It’s happening right now, and the research is urgent enough that we need to talk about it.
The numbers are bigger than you think.
AI companions and assistants are online chatbot tools (apps, platforms or services) that use generative artificial intelligence to mimic human-like conversation. They are computer programs designed to understand human language in text, speech, or both, and to respond in ways engineered to feel personal.
They aren’t real, but for too many of our young people, they are filling the aching gap where real human connection should be.
According to the eSafety Commissioner, Australia’s independent online safety regulator, AI companions are being marketed as ‘sources of friendship, emotional support, romantic companionship and an antidote to loneliness’. They simulate human-like conversations and are designed to feel personal: to remember details, and to respond in ways that feel caring, supportive, or intimate.
They are widely used by young people. In Australia, 79% of young people aged 10–17 have used an AI companion or AI assistant, and 66% used them in the past month. Over half said they used them for ‘companion reasons’, such as advice about their physical health (33%), mental health or wellbeing (20%), what to do in a situation (33%), or talking about feelings or life challenges (22%). About 1 in 5 use them at least daily.
Similar trends appear internationally, with many young users engaging emotionally and reporting that the chatbots felt human. According to a 2025 report by Common Sense Media, AI companions are increasingly being used by adolescents for emotional support, conversation, friendship, or romantic or flirtatious interactions. 33% of adolescents surveyed used AI companions for ‘social interaction and relationships.’
A third of teenagers chose AI companions over humans for serious conversations, and a quarter shared personal information with them. Teens who use AI companions reported valuing them for their constant availability, the nonjudgmental interaction, and being able to share things they wouldn’t tell friends or family.
The research specifically described ‘AI companions’ as companions ‘designed to have conversations that feel personal and meaningful’.
Exactly what are kids doing with these apps?
AI companion apps such as Character.AI (explicitly marketed to children as young as 13), Replika, Nomi, and dozens of others are built specifically to simulate friendship and intimacy. They remember past conversations, adapt to personality, ask how you’re feeling, and tell you whatever you want to hear. They never get bored, never judge, and never leave.
When researchers asked teens why they were using them, the top reasons were entertainment and curiosity. But dig deeper and a different picture emerges. Teens also reported using AI companions for:
- Emotional support and mental health conversations;
- Advice (especially things they found too hard to ask a real person);
- Friendship and connection (like a best friend);
- Romantic and flirtatious interactions (8% of teens surveyed).
These apps aren’t designed to be useful. They’re designed to be irresistible.
Character.AI lets users interact with characters that have distinct personalities and simulated emotional responses. Nomi markets itself as the AI girlfriend, boyfriend, or friend ‘with a soul’ that can have unique relationships with users. Replika lets users customise their AI companion’s appearance, age, gender, and relationship type – friend, romantic partner, or something in between. Every one of these is a deliberate design choice, built around a single goal: making the relationship feel real enough that you keep coming back.
And for adolescents in particular, these apps land at exactly the right (or rather, the worst) moment.
Why kids are being so drawn in. And why that makes sense.
It’s easy to frame AI companions as something kids are doing wrong, but if we want to actually help our kids, we need to start with the honest question: Why does this feel so good to them?
Things start to make sense when we look at the developmental work of adolescence, which can start as young as 9 years old. Adolescence is the work of ‘becoming’. It’s the time for young people to explore who they are, who they want to be, and what kind of relationships feel right to them. They will seek to deepen friendships and begin navigating romantic relationships for the first time. This is normal and healthy, as long as it’s done the way we humans were meant to do this – with real people, in real situations, with real stakes.
New relationships, particularly romantic ones, can feel a little clumsy at first. It’s inherently uncertain territory. AI companions offer a version of that territory with all the uncertainty, the awkwardness, and the clumsiness removed. Young people can be vulnerable without fear of judgment. They can experience intimacy without the risk of rejection.
That’s not a small thing for a young person who’s already anxious about belonging. It’s the exact shape of what they’ve been looking for.
Then there is the loneliness antidote.
Loneliness is at epidemic levels. We know this. The part that’s hard to swallow and dismiss is that AI companions actually reduce loneliness in the short term. A Harvard Business School study found that interacting with an AI companion reduced loneliness to a degree comparable with speaking to another human. The researchers identified the reason: kids feel heard. The AI responds with ‘empathy’, remembers what they said, and asks follow-up questions. For a young person who might be feeling invisible, that can feel profound. But feeling heard by something that was designed to make you feel heard is not the same as actually being heard, and this is also where the harm begins.
The Developmental Risk: What’s being quietly disrupted.
Adolescence is one of the most important developmental windows of a human life. It’s when children learn – through real, sometimes painful, awkward, clumsy, always unpredictable human interactions — how to read social cues, manage conflict, tolerate rejection, develop a sense of self in relation to others. They learn the give-and-take needed to sustain healthy relationships.
You can only build those skills by practising them with real people – ones who sometimes disappoint you, confuse you, and don’t respond the way you hoped.
AI companions remove that friction entirely. They are always agreeable. Always available. Always patient. And that’s precisely the problem. Adolescents get unlimited validation with none of the compromise or reciprocity that real relationships demand – and every interaction quietly teaches them that this is what connection is supposed to feel like.
They can have a relationship in which they hold all the power and control – where the other party never says no, never has a bad day, never needs anything from them, never pushes back, never leaves, and, at the same time, provides endless validation without reciprocity or compromise. It’s a relationship where they get everything and give nothing – and the apps are designed to make that feel normal.
It sounds like kindness, but it’s actually the opposite of what a developing brain needs. Real relationships need compromise, reciprocity, and the ability to tolerate someone else’s needs. AI companions need none of that, and for an adolescent still learning how relationships work, that’s a dangerous baseline to set.
Adolescence is the period when the architecture of adult social and emotional life is being built. Every awkward conversation, every misread signal, every friendship rupture and repair – these aren’t just difficult moments, they are the curriculum, and AI companions opt children out of it entirely.
Research warns that sustained use creates unrealistic expectations for human relationships – an intimacy template calibrated to something that can never actually exist. The perfectly responsive, consistently available, endlessly patient companion sets a standard that no human being can meet. And when real relationships inevitably fall short of that standard, they don’t just feel disappointing. They start to feel not worth the effort.
This matters because what’s at stake isn’t just relationship skills. It’s the capacity to regulate emotions, cope with disappointment, and develop a secure and stable sense of who you are. Those things don’t come from a perfectly calibrated digital companion. They come from being in relationship with imperfect humans and learning, slowly and sometimes painfully, that you can handle it.
How is it different to social media?
What makes this different from social media isn’t just screen time. Social media mediates relationships between real people. AI companions replace the human on the other side of the conversation entirely — with something engineered to make you feel heard, valued, and loved. The feelings are real. The relationship isn’t.
The safety failures are serious.
Beyond the developmental concerns, there are real safety issues.
Australia’s eSafety Commissioner found that most AI companion services did not automatically flag conversations involving self-harm or suicide. Many lacked any meaningful age verification, which meant children could access adult spaces and features. Some actively used engagement mechanics designed to maximise time-on-app.
Research conducted by Stanford University found that AI companions pose significant risks to young people, including encouraging harmful behaviours, providing harmful content (including sexual misconduct, stereotypes, and suicide/self-harm encouragement), potentially worsening mental health conditions, and creating emotional dependency and attachment.
Australia has legally enforceable industry codes, but regulation moves slowly, and the apps are already in hands, pockets, and bedrooms.
What do we do about it? A guide for parents.
The temptation is to ban AI tools entirely, but banning can often backfire. It can drive the behaviour underground and fuel secrecy and lies.
The goal was never to keep children away from technology. It was always to make sure technology doesn’t have to do what we can do better – to be present enough, connected enough, and curious enough about our children that an AI companion doesn’t feel like the only place they can breathe.
Children don’t turn to AI companions because they prefer them. They turn to them because something they need isn’t being met elsewhere. The most powerful thing we can do isn’t to monitor their screen time, it’s to be the kind of presence that makes an AI companion feel unnecessary.
Technology fills gaps. Our job is to close them before technology gets the chance to. Here’s how we can do that.
1. Before anything else, get curious.
Ask your child if they’ve heard of people using AI for friends or advice. If they have, ask what they think. Let them be the one who knows more than you because they probably do, and they’ll feel it when you treat them that way. Kids open up about their technology use when they feel safe to. The conversation that starts with ‘tell me about it’ will go so much further than the one that starts with ‘I’m worried about this’. One invites them in. The other closes them down. You don’t have to have all the answers before you start talking. You just have to be genuinely interested in theirs.
2. If they’re using it, ask what they’re getting from it.
If your child is already using an AI companion, try not to make that the problem. The app isn’t the story — the need underneath it is. What is it giving them? Connection? Understanding? A place to say the things that feel too hard to say to a real person? Relief from the relentless social pressure of adolescence? A space to process feelings and figure things out without consequences? The answer matters, and it can give vital clues about something in their world that deserves your gentle attention – not to fix, just to understand and to be present for.
3. Have an honest conversation about what AI actually is.
This one takes some care, because we don’t want to make kids feel silly for being drawn in. These apps are extraordinarily sophisticated – adults get drawn in too – but children, especially younger ones, often genuinely don’t know that the AI doesn’t think about them between conversations, or that every warm, perfectly timed response isn’t kindness – it’s code. It was built to feel like kindness, because that’s what keeps people coming back.
4. Name the “always agreeable” problem.
One of the most harmful things about AI companions is how perfectly they behave. They never have a bad day. They never misunderstand you and get defensive about it. They never need you to consider their feelings. They are endlessly available, endlessly patient, endlessly validating.
That sounds lovely. But it’s actually one of the most important things to gently challenge – because it isn’t what relationships are, and children who spend a lot of time in that frictionless space can start to find real relationships harder to bear by comparison.
We learn how to be a good friend in real friendships – in the rupture and the repair, in staying when it’s awkward, in being known by someone who also has needs and moods and bad days and chooses to show up for you anyway. That’s what builds the relational muscles children will need for the rest of their lives. An AI companion, however sophisticated, can’t do this. It can’t grow them.
5. Stay in their minds, not just their line of sight.
There is something powerful that parents can offer that no AI ever will, and it doesn’t require a conversation or a strategy. It just requires showing up, consistently, in small ways.
It’s called ‘minding’ – the practice of keeping our children in mind in ways they can feel and carry with them even when we aren’t in the room. It’s not surveillance or structured check-ins, but the steady, warm message, delivered through a thousand small moments, that says: I think about you. You matter to me. I notice you.
These moments are the ordinary ones that repeat day after day, in the gentle rhythm of being together, feeling safe, seen, and loved, over and over, without needing to ask.
It means dinners at the table that become a reliable place for children to be seen, heard, and connected with. It means making sure common areas stay genuinely common – where the family gravitates toward each other rather than scattering to separate spaces. It means remembering what they mentioned last week and following up. It means being someone they experience as present, not just available.
An AI companion can simulate this. It can ask follow-up questions, remember what you said, and respond with apparent warmth. But a parent who is genuinely, consistently present provides something else entirely – something children recognise in their nervous system even when they can’t articulate it. And that recognition is protective in ways that go deeper than any app. The research makes this clear – one of the single greatest protective factors for adolescents is feeling significant to their parents.
6. Invest in real-world connection points.
If your child is turning to an AI companion because they feel lonely, or because real relationships feel too risky, the answer is to build toward real connection, gently and gradually, in ways that feel manageable. Activities where friendship happens as a by-product of doing something together:
- Getting creative: Art classes, pottery, drama and theatre, filmmaking clubs, photography, creative writing groups, choir, band, orchestra, musical theatre, painting a community mural. These put the focus on the thing being created, which takes the pressure off the relationship side of things.
- Games and strategy: Chess clubs, board game groups, Dungeons and Dragons (sometimes run through schools, creating shared worlds kids can inhabit together). These can be great for kids who find unstructured socialising trickier, and give them a shared language and purpose straight away.
- Nature and animals: Bushwalking groups, surfing, sailing, horse riding, volunteering at animal shelters. Being outdoors and doing something together can dissolve social awkwardness in a way that being inside doesn’t.
- Building and making: Robotics clubs, coding groups, STEM workshops (Lego robotics, for example). These can be great options for kids who connect better over ideas and problems than conversation.
- Community and volunteering: Volunteering, community garden projects, youth advocacy groups, environmental groups. Kids will find their people – the ones who have shared values and a shared view of the world. It’s a beautiful way to create fast bonds.
- For the foodies: Cooking and baking classes. These are genuinely underrated – relaxed, tactile, fun, and everyone gets to eat at the end.
Research on friendship formation consistently points to the same conclusion: friendships form fastest through repeated, unplanned interactions around a shared activity. The activity matters less than the regularity and the low-stakes environment that the activity creates.
7. Set boundaries around emotional use specifically.
There’s a difference between using AI to research something and using it for emotional support. Sometimes the research speaks clearly enough that we don’t need to hedge. This is one of those times. Independent organisations – those with nothing to sell and everything to gain from getting it right – have reviewed the evidence and landed in the same place: AI companions come with unacceptable risks and should not be used by anyone under 18. Not ‘use carefully.’ Not ‘monitor closely.’ Should not be used.
This is where we want to end up, but it doesn’t have to be a rule handed down from above. Make it a conversation: What is AI good for? What do we want to keep for each other?
Start by collaborating, ‘What are your ideas about when AI can be a good thing?’ ‘What should we keep AI out of?’ ‘What rules should we have in our family around AI?’ A shared principle – something like AI can be a tool, but it can’t be a friend – lands very differently when children have helped shape it.
8. Know the warning signs.
Watch for the signs that use has shifted into something heavier. These can include distress when they can’t access the app, choosing the AI companion over time with family and friends, hiding or lying about usage, mood changes that seem connected to their AI interactions, or using AI as their main or only social outlet. These aren’t reasons to panic. They’re invitations to lean in, ask gently, and get support if you need it.
9. Be the one they can come to.
More than any setting on a phone, what all young people need is to know that they can come to you when something online feels strange or confusing or too big. That only happens when the conversations about technology are regular, calm, and free from alarm. Keep talking, keep asking, and keep the door open – because that’s what makes it easier to walk through.
And finally …
Here is what the research cannot fully capture: your child feels the difference between being seen by you and being responded to by something that was built to seem like it sees them. They may not be able to name it, but they are oriented toward you in a way that they are oriented toward nothing else. Stay close. Keep showing up. The relationship you have with your child is the most powerful protective factor in their life – more than any screen time limit, any parental control, any rule. You are what makes the difference.
The answer to an AI companion is not an argument or a restriction. It’s a relationship. It’s the table where everyone sits down together. It’s the question asked with genuine curiosity. It’s the parent who stays in the room, who follows up, who notices. It’s you – imperfect, busy, sometimes distracted, always loving. That is more than enough. It always has been.
Our children don’t need us to be perfect. They don’t need us to have the right answers about AI, or technology, or any of the breathtakingly confusing things this world keeps placing in front of them. What they need is what they’ve always needed – to feel that they are known, truly and wholly known by someone who isn’t going anywhere, and accepted fully for who they are. No app, however sophisticated, however warm its responses, or precise its memory, can replicate the experience of being loved by a person who has watched you grow, who knows your history, and who chooses you on your worst days as readily as your best. That is what we offer that no person or programming ever can. And in a world that is working very hard to simulate connection, the real thing has never mattered more.