60% of users confide in AI: a psychologist analyzes the dangers of this emotional refuge

In the hushed quiet of consulting rooms, one confession comes up more and more often. Patients say they talk to artificial intelligence to soothe their anxieties or to put words to their emotions. A discreet, almost banal gesture. But behind this new habit, a question arises: are we witnessing the emergence of an unprecedented form of support… or the revelation of a deeper loneliness?

Accessible at any time, without judgment or waiting, artificial intelligences are settling into the intimacy of emotional exchanges. A seemingly reassuring presence, but one that calls into question the very nature of human connection and its contemporary fragilities. The psychoanalyst and author Christian Richomme analyzes this phenomenon and its limits.

When artificial intelligence becomes an emotional refuge

The phenomenon is taking hold quietly, but steadily. In consultations, patients – adolescents, young adults, but also older profiles – now spontaneously mention their use of artificial intelligence to talk about themselves.

The figures give the measure of this shift: 60% of regular AI users say they use it for personal or emotional exchanges. Even more troubling, one in three people say they confide more easily in a digital tool than in those around them.

For some, this interaction represents a first step. A space to put into words what overflows. The AI then becomes a singular interlocutor: permanently available, patient, without gaze or judgment. It welcomes raw speech, helps organize thoughts, sometimes even puts words to a diffuse anxiety.

In a society where everything is accelerating, where listening is becoming rarer, this immediate availability acts as a relief. It responds to a simple, almost universal expectation: to be heard.

But already, a question arises. What happens when this space becomes the primary place for emotional expression?

Listening without presence: the illusion of a link?

Because behind this apparent self-evidence lies a fundamental limit. Artificial intelligence can simulate listening, but it does not experience the relationship. It does not feel, does not share, does not commit itself.

Yet the therapeutic relationship – like any human relationship – rests on a real otherness: a gaze, a voice, an embodied presence. Elements that, by their very nature, escape the digital tool.

In consultation, some patients describe a paradoxical experience. They feel understood, but remain alone. Listened to, but without a real encounter.

This ambivalence is at the heart of professionals’ concerns. As psychoanalyst Christian Richomme summarizes:

"The risk is not talking to artificial intelligence. The risk is believing that it replaces human connection."

The danger, then, is not the use itself, but the gradual shift it can induce: a relational need displaced onto an interaction without reciprocity. In this face-to-face with a machine, there is no contradiction, no frustration, nothing unforeseen. The exchange is fluid, adjusted, almost perfect. But precisely because it is perfect, it escapes what makes the richness – and sometimes the difficulty – of the human bond.

What this trend reveals: growing emotional loneliness

Beyond the tool, this phenomenon acts as a revealer of a broader transformation in our relationship to others. The data is clear: heavy users of artificial intelligence report higher levels of loneliness than average. As if the use, far from filling a void, underlined its depth.

Behind these digital exchanges, well-known clinical realities emerge: emotional loneliness, relational insecurity, a constant need for reassurance. AI does not create these fragilities – it settles into them, sometimes calming them temporarily, without resolving them.

It offers an immediate response in a world where waiting has become hard to bear. It avoids conflict, circumvents awkwardness, erases unpredictability – where human relationships demand time, commitment, and confrontation with otherness. So, should we be wary of it? Not necessarily.

The challenge for professionals is not to reject these tools, but to understand what they provide. And to support patients toward a balance: using these spaces as occasional supports, without letting them become substitutes for real connection.

Because ultimately, this movement tells a larger story. That of an era in search of listening, but sometimes helpless in the face of the complexity of human relationships.

One certainty remains: no technology, however sophisticated, can replace what is at stake in an encounter. A gaze, a shared silence, an imperfect but living presence. It is there, still, that the essential is woven.