When his uncle suddenly passed away while working away from home, Yang hired someone to “resurrect” him with AI so the uncle could talk to Yang’s 90-year-old grandmother.
Yang’s grandmother was frail, and no one wanted to break the news of the uncle’s death, fearing the shock would harm her. Drawing on his experience as an internet professional, 30-year-old Yang, from Nanjing, spent 10,000 yuan on an AI service to create a virtual version of his uncle.
During the call, his mother and other relatives stayed out of the room, fearing they could not contain their emotions. Yang asked the service to keep the conversation brief to avoid detection.
Yang’s grandmother did not notice anything unusual. “Because I have some understanding of AI technology, I could detect it. But if someone knows nothing about that technology, they would easily accept it,” he said.
However, the practice of using AI to “resurrect” the deceased raises many ethical questions, and Yang acknowledged there are no satisfactory answers. As an industry insider, he readily accepts it, but outsiders and the elderly find it harder to accept.
Yang’s grandmother during an AI call with his uncle. (Photo: Sixthtone).
Shen Yang, a professor of journalism and communication at Tsinghua University, noted that the companionship of AI can help maintain stability in family life. “If we create an AI version of the deceased, their relatives may still think they are alive and feel that they still have companionship in life,” Shen stated.
However, the professor also warned of potential ethical issues, such as whether the deceased would have consented, while alive, to an AI version of themselves being created. “People should clearly express their wishes before they die. In the future, we may need to clarify these issues in civil law,” Professor Shen said.
The person Yang hired was Zhang Zewei, the founder of Super Brain, a creative AI content company. Zhang mentioned that he transitioned from teaching AI courses to starting a business in “resurrecting” individuals with AI.
According to Zhang, he initially had no intention of profiting from this and worked for free for the first dozen or so people who approached him. Unexpectedly, however, the business became a tremendous success within a year. He now charges from several thousand to tens of thousands of yuan per case, with a profit margin of about 50-60%.
Zhang screens customer requests to evaluate the authenticity of their needs and only accepts less than half of the orders. As of mid-March 2024, Zhang’s studio had received nearly a thousand requests, generating millions of yuan in revenue.
Zhang’s studio is the first to establish a business for “resurrecting” the deceased using AI in China and remains the largest company in this sector to date. Among Zhang’s clients, cases like Yang’s are typical, involving situations where a family member has died or been imprisoned and needs to conceal this from younger and older family members.
In addition to AI video call services, Zhang also offers role-playing games with the deceased aimed at healing psychological trauma. Last year, Zhang received a request from a woman wanting to say her final goodbye to her boyfriend, who had suddenly passed away two years earlier.
“We had dinner together just two days before he suddenly passed away. It’s been two years, but I still can’t move on,” the woman said. “I’ve tried therapy and even done some extreme things. The life we planned together was suddenly and completely shattered. I wanted to have a conversation with him to say goodbye properly. Perhaps after that, I could restart my life.”
When the video call began, the woman saw her boyfriend’s face and couldn’t hold back her tears.
Currently, fan clubs of the late Hong Kong singers Leslie Cheung and Coco Lee are in talks with Zhang to create digital representations of their idols, and he is in the process of obtaining the necessary permissions.
Zhang Zewei demonstrating face recreation. (Photo: Sixthtone).
For services involving deep psychological interaction, in addition to using AI technology to mimic people’s appearances and voices, Zhang also hires psychologists to play the role of the deceased. Unlike Yang’s grandmother, customers using this service understand that the person on screen is not real. Essentially, it becomes a role-playing game with therapeutic effects.
Being aware of the potential ethical concerns surrounding his services, Zhang signs agreements with clients to protect their privacy and stipulates that the digital AI representation cannot be used for any illegal or inappropriate purposes. If clients wish to “resurrect” a deceased person, they must provide proof of their relationship.
Initially, Zhang felt conflicted about providing such services. He said that he had experienced little suffering himself, yet suddenly had to absorb the pain of hundreds of clients. “It has had a huge impact on me. I often find myself dwelling on their circumstances at night,” he recounted. Over time, his emotions have “become numb.”
Professor Shen also cautioned that good intentions do not always lead to good outcomes. Although the use of AI in this way aims to provide psychological support, AI simulations of loved ones cannot completely replace real people. This service could lead some clients to develop illusions, thinking they can truly communicate with their deceased loved ones, which may result in trauma and dependency or hinder natural recovery processes.
“The long-term effects of such services must be considered, including the potential for developing unhealthy attachments to the deceased,” Professor Shen stated.