
What if artificial intelligence could recreate a memory… that never existed? That is the unsettling experience of Alexis Ohanian, co-founder of Reddit, who used AI to bring his deceased mother back to life in a video. An intimate, overwhelming, but also controversial gesture. What are the psychological risks of these false memories? Psychologist Amélie Boukhobza offers some answers.
An artificial video, an emotionally real memory
Alexis Ohanian, the American entrepreneur and husband of Serena Williams, recently made headlines by publishing on X (formerly Twitter) a video generated by artificial intelligence. It shows him as a child in the arms of his late mother. Using the Midjourney tool, he animated an old photo to reconstruct a scene that never took place: a maternal hug. A moment that is both simple… and deeply moving.
In his post, Alexis Ohanian confesses: "I was not ready for it to affect me so much."
He admits to having watched the scene "at least 50 times," fascinated by this rediscovered emotion, however artificial.
Reactions split between wonder and unease
The video quickly sparked a wave of contrasting reactions. While some hail a moving technological feat, others point to a disturbing drift. Can a memory really be recreated? And at what cost?
"Replace your real memories with fake ones for $19.99 a month," quips one internet user.
Others draw a comparison to the "Mirror of Erised," the magical object in Harry Potter that shows what we desire most… without ever bringing back those who are gone.
How does the brain react to artificial memories?
For psychologist Amélie Boukhobza, these uses of artificial intelligence raise real questions that go far beyond the technology itself.
"We hear a lot about the dangers of AI, but there are also its more intimate, more unsettling uses," she says.
Because even if the scene is fake, the emotions can be very real. "An invented but emotionally credible memory can activate the same brain areas as a real memory," the specialist explains.
In other words, even a scene rebuilt from scratch can produce genuine relief.
She adds:
"Our defense mechanisms already do this, by the way: they rewrite certain memories to make them more bearable. The power of the brain is immense."
Consoling memories… but not healing ones
While an illusion can bring temporary comfort, it does not replace lived experience.
"Creating a memory is sometimes an attempt to fill a void: that of a bond, of a gesture never received, of a word never spoken," continues Amélie Boukhobza.
But be careful not to confuse consolation with healing. According to the psychologist, these videos can bring relief, yes, but they "do not heal the original wound."
The practice is becoming more widespread, with some companies now offering this type of service to bereaved families. In China, artificial intelligences go so far as to simulate conversations with the deceased in order to "extend the bond."
A tool for grieving or an emotional escape?
Faced with this emerging trend, the psychologist warns against the unreflective use of these recreated images.
"Creating false memories, yes. Fleeing the reality of grief, no," she sums up.
"I would not go so far as to condemn these practices, but we must remain clear-eyed about what they bring into play."
In Alexis Ohanian's case, the video could serve as a trigger, allowing him to grieve an embrace he never received. But the risk would be to seek in it a permanent substitute, a form of rewriting of the past.
"If we try to replace the memory rather than face what was, or was not, the risk is to freeze the grieving process instead of supporting it," she concludes.