This is what AI ‘ghosts’ will be like: Working after our death, advising our grandchildren, or accidentally revealing an affair

A Google study analyzes the unexpected consequences of using this new technology to reincarnate oneself or loved ones

Artificial intelligence
Jordi Pérez Colomé

Imagine a hypothetical situation in which a father creates a replica of his deceased son using artificial intelligence (AI). If this ghost were frozen at the age the boy died, it would always behave like a young child: the same way of speaking, the same level of maturity, the same appearance. But what if it were an evolving version? Would this replica grow into a teenager or an adult?

This dilemma is just one example of the future that AI-powered generative ghosts could bring, a term coined in a new scientific paper by Meredith Ringel Morris of Google DeepMind and Jed Brubaker of the University of Colorado Boulder. What are the realistic chances of AI being used to generate artificial life after death? “The odds are high, especially now that AI is becoming more powerful, accessible, and common,” says Brubaker. “People are already using AI to preserve voices or mimic deceased loved ones. Online memorials have also become commonplace; generative ghosts could be next, especially as part of memorialization, end-of-life planning, and history.”

Some companies have already seen a commercial opportunity for these ideas. Re;memory allows you to create an interactive virtual version of someone after interviewing and recording them for seven hours. Another example is HereAfter AI, an app that interviews a person with the idea of eventually creating a digital version for after their death. With some differences, both allow you to converse with a kind of chatbot that represents your loved one and which can share photos, voice recordings, and memories from their life.

In some Asian countries, where people have a particular relationship with death and ancestors, some of these practices are already more normal than in other parts of the world. “East Asian countries like China and South Korea seem to be ahead of the curve, in part because their cultural traditions see it as normal to continue having a relationship with their ancestors. In Western countries, on the other hand, its adoption depends more on how each person views technology, death and grief,” says Brubaker.

Chatting about the Pope with a dead person

But generative ghosts can be much more than a simple chatbot, which is no small feat in itself: some families will find comfort in discussing the news, such as the death of the Pope or a more recent event like a daughter’s wedding, with someone who has passed away. Another of the next steps in creating AI afterlives is agents, programs that can handle tasks on our behalf. In the future, it will not be unusual to encounter deceased figures who have left their own agents behind: no longer just a chatbot, but artificial intelligences capable of performing tasks, such as resolving a dispute over the deceased’s own inheritance, or even carrying out work-related duties after someone has retired or died. “Generative ghosts might also support family members by providing advice on procedures that they had been responsible for in life (e.g., teaching a surviving spouse how to cook a favorite dish or repair the kitchen faucet). In some cases, income provided by generative ghosts’ participation in the economy might support family members,” the article says.

These types of jobs won’t be so unusual for an AI trained on a specific corpus: “Some might write books, answer questions, or act as virtual advisors, especially if they were experts in life. For example, a teacher’s ghost could continue teaching, or a musician’s ghost could create new songs. It sounds a bit futuristic, but it’s not so crazy if we consider how quickly AI is advancing,” says Brubaker.

It is unclear what kind of individuals or families are most likely to create these types of avatars, and there are no studies analyzing it, says Brubaker, but there are already some public cases. “Generally, they are people with a certain level of technological proficiency, concerned about leaving a legacy, or who are facing serious illnesses. Their motivations are often related to emotional needs and connection with others,” he says.

Many foreseeable risks

The section on foreseeable dangers is extensive. “It’s not that there are necessarily more risks, but they are more complex and often not as obvious,” says Brubaker. “The benefits, such as providing emotional comfort or leaving a legacy, are easily understood. But the risks need more explanation to be able to foresee and avoid them,” he adds.

One clear example is emotional dependence on a machine that represents someone who is no longer there, although in the paper the authors use the word reincarnation to describe it. “It’s primarily a metaphor,” Brubaker warns. “We use it to describe a generative ghost that acts by imitating a deceased person. It’s very easy to imagine an AI that imitates a person’s voice and mannerisms,” he adds.

But there are other risks that are harder to foresee, such as reputational or privacy issues. An AI trained on a grandfather’s texts or recorded phrases might suddenly reveal that his views were racist or derogatory toward certain groups, whether or not that is true, because these ghosts can hallucinate like any other AI. It could also disclose a past lover or an action that the deceased would have preferred never to come to light.

Some people could even program their generative ghosts to continue bothering someone from the afterlife. “In addition to ghosts that might engage in post-mortem harassment, stalking, trolling, or other forms of abuse of the living, malicious ghosts might be designed to engage in illicit economic activities as a way to earn income for the deceased’s estate or to support various causes including potentially criminal ones,” the article says.
