The future of death

Is death still the end? Or does advancing technology already provide us with ways to live on?
Futurist Scott Smith, a visiting research fellow working with MOD. on the creation of its FOREVER exhibition, examines the possibility of an artificial intelligence afterlife and questions its implications.
Throughout history, humans have sought to communicate with the dead across cultures, from spirit-raising rituals to Victorian séances and Edison's spirit phone.
Today, as with nearly everything else, AI seeks to continue this tradition. There's growing interest in using AI to create chatbots or virtual avatars that mimic the deceased, enabling "naturalistic" interactions with digital models of the departed.
Applications like Replika, launched in 2017, allow users to create or recreate avatars trained on datasets to respond as desired, such as a friend or romantic partner. A segment of these adapted general chatbots has more recently become focused on specific replication of the deceased, giving us an array of terms, including "deathbot", coined by Patrick Stokes in his 2021 book Digital Souls: A Philosophy of Online Death, and "griefbot".
Now, a veritable congregation of chatbots is available to users to communicate with a digital recreation, including StoryFile, HereafterAI, You, Only Virtual, Forever Identity, and SeanceAI. In 2022, Amazon demonstrated a scenario that included its Alexa smart home device reading a story to a child in the voice of his deceased grandmother.
Even generative AI tools like ChatGPT can be transformed into a personalised griefbot if trained with sufficient text from the deceased to provide tone of voice and word choice. This has enabled technically savvy individuals to create chatbot personas of loved ones. As this class of technology is increasingly integrated into everyday technology like office software and smartphone apps, building such chatbots is becoming more accessible to a non-technical public.
Not surprisingly, the new frontier of death tech, particularly the digital reconstruction of real individuals post-death as a way to maintain a semblance of contact, or the use of imaginary figures intended to help with emotional trauma or grief, has not been without complication and controversy. Poor-quality recreations, unexpected service shutdowns that delete digital loved ones, unauthorised depictions or reproductions, and bots that amplify undesirable aspects of personas drawn from life are among the actual challenges that have been reported in recent years.
This is leading to the development of additional services that offer to protect an individual's likeness, anticipatory copyright of likeness as a way to lock out competitors in the case of some celebrities, and, of course, attempts to extend image rights after death. In the future, memories of some people may be more valuable by orders of magnitude than their live presence could be.
The frontier is moving quickly. Applications of this approach are also being used controversially as a form of activism. In early 2024, parents of children killed in several high-profile school shootings in the US collaborated with voice AI company ElevenLabs to recreate the voices of their lost loved ones in automated phone calls to Congress members in their campaign for stricter gun controls in that country. Each call uses samples built from recordings of the children, delivering posthumous messages to legislators about how they died.
While this approach will undoubtedly stir heated debate, it will also likely drive demand for voice clones of the dead more broadly.

Ethics in an AI afterlife
Some of us take for granted that fragments of our actual as well as digital lives will remain visible online for an indeterminate time. This includes everything from photos reposted across various social media to articles, profiles, comments, podcasts, tweets, WhatsApps, doorbell videos, or chat messages, to images of now-departed loved ones captured, and found, on Google StreetView.
If you've contributed any form of content or visible interaction to the Web at all, it lives on in redundant servers, or in archives such as the Internet Archive's Wayback Machine. We're likely to be remembered whether we want to be or not: on digital CCTV, in the background of someone's selfie, or as a fragment deep inside some company's lucrative AI large language model. For some, these digital traces represent a way to remember the lost.
Writing in the New Yorker in 2015, author Matthew Malady recounted seeing his deceased mother in a StreetView image: "The confluence of emotions, when I registered what I was looking at, was unlike anything I had ever experienced – something akin to the simultaneous rush of a million overlapping feelings. There was joy, certainly – 'Mom! I found you! Can you believe it?' – but also deep, deep sadness."
Malady's reaction is a good summary of how many might feel on their first encounter with such a digital memory. However, beyond the emotion lie questions about who owns and maintains that blurred memory, which, unlike a memorial in a cemetery or at a roadside, can't be easily refreshed or taken down by relatives.
There are emerging regulations around deepfakes; however, these don't address specific post-mortem situations. The race to build posthumous AI avatars raises ethical and emotional considerations about the nature of grief and memory, as well as considerations about control and privacy, and runs far ahead of existing regulatory frameworks.
With our ever-growing digital footprints, there is a wealth of data to pull from to give a loved one a second life, much of which can be reconstructed through the use of tools such as voice generators and language models.
This emerging technology prompts numerous questions: How does interacting with a digital version of a lost loved one impact the grieving process? Who controls these digital models, and for how long? What are the ethical boundaries of recreating someone posthumously, and who has the right to do so? How is the data for these avatars sourced, and does control of that data confer rights to create such avatars?
Should these digital representations be treated as heirlooms, passed down through generations, or should they have an expiration date? Additionally, how can we prevent emotional abuse through malicious use of these avatars?
These questions underscore the need for careful consideration of the long-term implications of posthumous AI technology on individuals, families, and society as a whole.
Scott Smith was a visiting research fellow at UniSA, working with MOD. to create the FOREVER exhibition, examining what the future of death could, or should, look like. He is a US-educated global futurist and author. This article is an excerpt from his essay on the topic which forms part of the book accompanying the exhibition. The book is available for purchase online or in-person from MOD.
Images created by Lachlan Wallace, Communications Officer for the University of South Australia, using ChatGPT.