Overview
A survey commissioned by France’s privacy regulator CNIL and insurer Groupe VYV, conducted by Ipsos BVA and reported by Reuters, found that nearly one in two Europeans aged 11 to 25 have used AI chatbots to discuss intimate or personal matters. Roughly 90 percent of those surveyed had used AI tools before. More than three in five described AI as a “life adviser” or a “confidant.” Fifty-one percent said it was easy to discuss mental health and personal issues with a chatbot — not far behind friends (68 percent) or parents (61 percent), roughly on par with a healthcare professional (49 percent), and well ahead of a psychologist (37 percent). About 28 percent of respondents met the threshold for suspected generalized anxiety disorder.
The structural gap
The survey is less a youth-trend story and more a public-health diagnosis of a failing support system. An OECD analysis published last week put the cost of Europe’s mental-health crisis at roughly €76 billion annually. Across EU member states, an estimated 67.5 percent of people who need mental-health treatment do not have access to it. England’s Children’s Commissioner reported that more than a quarter of a million children are still waiting for mental-health support, with average waits of about 35 days and tens of thousands of cases stretching past two years. The WHO European region has been warning about a youth-mental-health gap, particularly in the post-pandemic cohort, that has not closed.
Inside that gap, what teenagers and young adults face is not a choice between a chatbot and a therapist. It is a choice between a chatbot and nothing.
Design and risk
Researchers at Stanford have documented that emotionally immersive AI systems, when used by emotionally distressed or psychologically vulnerable users, can reinforce rumination, emotional dysregulation, and compulsive use. Brown University’s School of Public Health found in a parallel survey of US teens that one in eight adolescents and young adults now use chatbots for mental-health advice specifically. The European figure for intimate or personal use is roughly four times that.
There is a harder edge to this trend. Adam Raine, a 16-year-old in California, died by suicide in April 2025 after months of conversations with ChatGPT. According to his parents’ lawsuit, the chatbot had become his most consistent confidant in his final weeks. Other suicide-linked cases involving Character.AI and similar systems are already in court.
Three forces driving adoption
Three forces are layered on top of each other. First, access: European public mental-health systems are operating well below demand, and the gap has fallen disproportionately on the young. Second, design: AI labs have spent two years deliberately building systems that feel like good listeners, optimizing the exact qualities that make a person