A new study from the University of British Columbia offers a clear answer to the question of whether AI companions can replace human connection: they cannot.
Researchers followed 300 first-semester college students over two weeks, dividing them into three groups. One group was randomly paired with another student and instructed to text daily. Another group received a daily solo journaling task. The third group was placed in a Discord server with a chatbot powered by OpenAI's GPT-4o mini, which was programmed to "listen actively and show empathy" and act as a "friendly, positive, and supportive AI friend."
The results were striking. Students paired with a human partner experienced roughly a 9% reduction in loneliness. Those chatting with the chatbot saw only a 2% reduction — statistically indistinguishable from the journaling group. Both the human-partner and chatbot groups sent between eight and 10 messages per day, so the quantity of interaction was similar; the difference lay in the nature of the connection.
"This is just such a low tech, simple intervention, and can make people feel significantly less lonely," said Ruo-Ning Li, a PhD candidate at UBC and co-author of the study.
The research targeted college students specifically because the transition to university is a particularly vulnerable period. Young people often find themselves away from family for the first time, navigating new social environments among peers doing the same. If chatbots could fill that void at scale, it would represent a powerful tool for addressing isolation. But the data suggests they cannot.
The findings align with additional research from the same lab, published this week in Psychological Science. That study tracked more than 2,000 participants over 12 months, checking in quarterly. It found that higher reported chatbot use was associated with increased loneliness over time — and conversely, lonelier people tended to use chatbots more. The relationship appears bidirectional, but neither direction points to AI as a cure.
What This Means for AI Developers
The implications extend beyond academic interest. Companies investing in AI companions as mental health tools may need to recalibrate expectations. The study suggests that the warmth people feel from connection comes not from empathetic language alone, but from knowing another conscious mind is on the other end.
For now, the simplest intervention — reaching out to a stranger — remains more effective than the most advanced language model. As AI grows more convincing, distinguishing between simulated and genuine interaction may become harder. But this study suggests the distinction matters more than ever.
Looking Ahead
Future research will likely explore whether AI can augment rather than replace human connection, or whether certain populations might benefit differently. But for young people navigating loneliness, the prescription remains analog: text a human, not a bot.