
Griefbots and the Digital Afterlife

This week on "Waking Up With AI," Katherine Forrest and Anna Gressel introduce griefbots, AI models that simulate the deceased, examining the legal, ethical and financial issues that arise when technology offers a digital afterlife.


Katherine Forrest: Hey, folks. Welcome to another episode of "Waking Up With AI," a Paul, Weiss podcast. I'm Katherine Forrest.

Anna Gressel: And I'm Anna Gressel.

Katherine Forrest: And you know, Anna, I can see, nobody else can see, but I can see that we're both in our offices. And we're about 20 feet away from each other. And you know, there are people, I've learned (and I've said this before, but I'm always so amused by it) that get their information on your whereabouts, particularly the Anna Gressel whereabouts, from our podcast. Because you're always all over the place.

Anna Gressel: I know! It's true. I had someone message me, like an old friend, and say, oh this is what's up in your life. And I was like, how in the world did you know that? But then, we actually do tell everyone.

Katherine Forrest: Right, right.

Anna Gressel: But yeah, no, we're both in New York. And I haven't even seen you in person today. And we're like 20 feet down the hall, which is a real rarity.

Katherine Forrest: I know, I know. It's so funny. So close, yet so far. But, you know, we're in New York. You've been here two weeks, which is a virtual marathon for you.

Anna Gressel: Definitely. Yep.

Katherine Forrest: So, I had this idea for this podcast, which you've so kindly agreed to do. And it comes out of this book that, you know, I'm writing, that I talk about, that one day I'll actually finish. Because it's actually due, by the way, the book is due. But it's called "Of Another Mind," and you know all about that, right?

Anna Gressel: I do, I do. And I'm so impressed that you write books on the weekends, and I don't even know what I do that stacks up to that, but I'm super excited to read it.

Katherine Forrest: It's a fun one. But it's all about AI and the different kind of mind that AI has and a different kind of mind from one that we humans have.

Anna Gressel: Totally.

Katherine Forrest: And the research that I have been doing for this book, it brings me to all kinds of places. Some of which are exactly the kinds of places that you would imagine, like AI and sentience, AI and consciousness, what kinds of personalities are we trying to put into AI, all kinds of things. But one thing that I started to run across was ghost AI, the construction of something called griefbots.

Anna Gressel: Yeah, it's easy to anthropomorphize, kind of like we do with animals or objects. But it's so easy to anthropomorphize with AI, in part because I think we have this long-storied sci-fi tradition that we're drawing on when we interact with them. And then also I think there's maybe just a human instinct to do that.

Katherine Forrest: Right, by the way, before we get into this griefbots thing, do you ever name or ask your AI chatbot its name?

Anna Gressel: No.

Katherine Forrest: Not once, not one time?

Anna Gressel: No, I think, but I'm also not someone who names their objects that much. And I know you have a car with a name. I think I'm like, I love my headphones, but my headphones don't get a name in my life, you know?

Katherine Forrest: Well, my headphones don't have a name. I mean, I'm not that strange. All right, well, because my wife's chatbot's name is Quinn. And her friend has one that gives her sort of romantic advice, and his name is George.

Anna Gressel: What?

Katherine Forrest: Yeah. Yeah.

Anna Gressel: We have to do an episode on that!

Katherine Forrest: Right. Well, apparently, it's really good advice. George, in particular, he's got a high EQ. But anyway, I digress. The point about personalities and chatbots, and we could go on and on and on about them, in the context of this new phenomenon called griefbots, is that they really are becoming a thing, and I thought we could spend some time today talking about them.

Anna Gressel: Yeah, I would love to. I think it's a completely amazing and interesting topic. And I think there's starting to be some attention on those too these days.

Katherine Forrest: Right. They're like little mini large language models, although, you know, you use the word "mini" and you use the word "large language model," and it seems like it's somehow inconsistent, but it's not. It's really sort of a model on top of a model. It's sort of a fine-tuned model of a particular person that's running on top of a base LLM.

Anna Gressel: Yeah, and I think we're specifically, when we talk about griefbots, talking about a model that is trained to act like or simulate someone who is deceased. Right, Katherine?

Katherine Forrest: Yeah, you're right. That's the whole point of the griefbot versus just a chatbot. And the concept is, and this is real, this is not fake, folks. This is not something that is in the future that we're making up. But the concept, which is being marketed today, is that you can upload all of the data that you have about a person, maybe even before they die, maybe even the person would do it themselves. And you would get their letters and you'd get their audio tapes and you would get maybe their social media. You'd get everything you could about this person. And that corpus of material would be used to train a model to essentially be a chatbot that responds and "acts" like that person.

Anna Gressel: Yeah, I mean, this doesn't surprise me, in a sense. There are a lot of people who reach end-of-life stages and write memoirs so that they can pass down stories to their family they've never told them, for example. There are all kinds of people. I know my cousins at one point did an interview of someone in our family who was terminally ill so they could hear those stories and have a video of that person. And so, there's a lot of end-of-life planning, obviously, that people do. And it's interesting to think about planning for kind of an interactive version of that. But that's not new in the sense of, again, this has been terrain that we've seen trod for a long time in the world of sci-fi. The idea that, if you got there in time, you could capture someone's essence and store it and make it interactive and maybe bring that person to life in some way, or at least create a version that people could hold onto for a little bit longer. So it's quite interesting now that some people are starting to really do that. Some companies are starting to do that, and they're offering to do interviews of a person who might be terminally ill and other people who know them to gather this corpus of information, because even in the sci-fi world you have to store the information before you can replicate someone, right? The idea is you have to capture it to use it going forward, and so it's actually kind of amazing to me that we're at this point that companies are starting to think about that.

Katherine Forrest: Right, when you talk about interviewing someone at end of life or trying to store the information, one of the things that some of these companies do is provide audio. You can have a back and forth, and I saw a demo of one where it showed someone talking to their now deceased mother. And so part of the exercise of gathering the information is to actually gather an audio print of the person who is either about to be deceased or is then later deceased. And so that voice print can be used, essentially, to make an audio deepfake of that person that can respond as the person would respond through the information given to the chatbot. So the chatbot isn't just a text-based chatbot, it can also be an audio-based chatbot, and also presumably a video, though I've not seen a demo of a video one.

Anna Gressel: Yeah, I mean, I haven't seen a demo of a video one in our time. Obviously, that concept exists in a lot of fiction. The idea of kind of embodying someone or making their video come to life. It's certainly terrain that has been trod. But in terms of situating ourselves where we are today, presumably the concept would be you'd record this information about someone. You'd have this data, that text-based data about what they had said. You'd record their voice, and you could create a voice print, meaning you'd have this mathematical representation of their voice. And then you could hear their voice speak words or some simulation of the words and maybe the personality, even, that came through that text. It would have some information about that person. It would also have their syntax, how they spoke, what kinds of words they would typically use. Little turns of speech, right? And so the idea is you could capture all of that, and then have this interactive chatbot, maybe with a voice. So Katherine, I don't know if you think that there's other stuff on the market, but I think this is kind of where these griefbots are roughly headed, right?

Katherine Forrest: No, I think you're absolutely right. I think the personality is a big part of it. And it reminds me of this "Black Mirror" episode that goes way back to 2013. You know, you were talking about sci-fi just a few minutes ago. And sci-fi sometimes really does anticipate the future. Oftentimes it does not, but there are times that sci-fi actually can anticipate the future. And there was this episode in "Black Mirror" called "Be Right Back." Did you ever see "Black Mirror"?

Anna Gressel: Mm-hmm, yeah, for sure.

Katherine Forrest: And did you ever see this particular episode? Do you know the one I'm talking about?

Anna Gressel: Yeah, the man and the woman are in love, and they move out to the countryside. And then the man is killed the day after the move. And the main character, Martha, learns of a service that creates a virtual avatar of deceased loved ones using, for example, their social media posts or email.

Katherine Forrest: Right, and she has all of these interactions with this avatar over the progress of the episode. And at first, they're sending each other messages, but eventually Martha signs up for the experimental feature that allows the avatar to be embodied in an android that simulates the likeness of her deceased partner. And she actually ultimately ends up having some disgust for what this feature has enabled, but we won't go into that any further except to say that there are ways in which these griefbots could play themselves out not so very different from something that was imagined, now, literally over a dozen years ago.

Anna Gressel: Yeah, and I think, in preparation for this, I watched a movie that I hadn't seen before called "Archive" just over this weekend. And it was really interesting. I mean, it raises questions, really interesting questions about what happens when you create multiple versions of these griefbots and they become aware that, you know, they're not the only one and just...

Katherine Forrest: Wait, hold on, that's one of the ideas I had. I thought, what would happen if my siblings and I each created our own version of a griefbot for a deceased parent and one person got their version of the parent, and somebody else got their version of the parent, and those two parents were in some ways different? I mean, wouldn't that be bizarre?

Anna Gressel: Bizarre, but also from someone who has siblings, isn't that so true too? Like your parents are never quite the same for every sibling. So it also doesn't feel like a long shot, right? Like, you know, we all have such unique relationships. And so do you get that relationship? Or is the person like their own person? I mean, you can see then why this raises such interesting questions. We're just beginning to scrape the surface, and it's just such a fascinating area.

Katherine Forrest: Right, and then we have to ask ourselves, you know, would we want this to happen to us? So let's talk about some of those legal and ethical issues that get raised.

Anna Gressel: Yeah, I mean, we can walk through a few of them. There are so many, but we'll walk through.

Katherine Forrest: Right.

Anna Gressel: So for example, with training any model, it's just like any AI out there, you need a lot of material, a big corpus of data for a griefbot. That could be diaries, letters, text messages, social media posts, family videos, other videos, like I have videos of me on YouTube, right? So do many people. And if they were a writer, an academic, maybe their articles would be part of that data corpus.

Katherine Forrest: No, it really could be anything.

Anna Gressel: Yeah, and this potentially raises some interesting IP questions. That is, who owns the letters, the video, the photographs? I'm laughing a little bit because my sister's a writer, so if she wrote about me, would she have to consent to someone else making a griefbot of me using her book, right? So you can see how that would come up.

Katherine Forrest: Well, you're right. I mean, some of these issues are really complicated. You know, who owns it if you've got a YouTube post with video and audio, but it's actually a conference segment, and it's owned by X company that sponsored the conference or X academic institution? You can see that these things can get pretty complicated.

Anna Gressel: For sure. There might be children who would have rights to their own photos, but not rights to some of the letters, or there might be different publishers or academic institutions who own copyrights. There might also be fair use issues that could come into play for copyrighted material, depending on how courts actually came out with currently pending cases, or whether they saw this kind of personal use as a little bit different than commercial uses. So it's quite interesting how that might actually play out in practice.

Katherine Forrest: And then there could also be right of publicity and name and likeness issues. And in some states, those get extinguished at death. And in some states, they don't, but they might change. And it can be a state-by-state thing. And then you've also got the issue of people who are living who might be captured in the material that you're gathering up for this deceased person or soon-to-be-deceased person. You might have family photos that have lots of people in them, many of whom are still living. You might have video, you might have people who are responding to letters, and all of those pieces of information relating to these third parties can raise their own issues.

Anna Gressel: Yeah, and another legal issue could be if the griefbot makes a statement, and the statement is disseminated, there may be a question of whether there's a First Amendment right attached to that so-called speech, if we even think it's speech at all.

Katherine Forrest: Wow, actually that's really fascinating, isn't it? You've got a griefbot who's speaking in the voice of a deceased person. Is that speech? I guess the way I would analyze that is to say that it would still be a model that is emitting the answer to a query of some sort. But boy, I'm just thinking about this off the cuff. You'd really have to think about it. You might also have a situation where there is a claim made by somebody that the model has actually drifted. You know, we talk about model drift, when the model starts to perform in ways that are different from how they originally performed. And so, imagine you've got a griefbot. It's originally architected and designed and trained to act like a particular person. But then it starts to go a little rogue. It might drift a little bit, and it might start giving advice or responses that are not the kind of advice or responses that the individual who this model was originally trained to personify actually would have given.

Anna Gressel: Mm-hmm. I mean, you can imagine how that would be if you had a griefbot of, like, a religious figure giving that kind of advice. I mean, you could easily see a very interesting ethical overlay onto all of that. But there are probably going to be a lot of disclaimers, just in terms of the folks making these products available. I'm sure a huge amount of thought will go into how they're released. And then there'll be a lot of questions about how they can be used ethically downstream by the users themselves, you know, far beyond the developers.

Katherine Forrest: Right, right, right. And then there's questions like, what about the financial transaction aspect of the griefbot? It's going to be probably through some sort of licensing or subscription model. And what if the company that's actually hosting the model goes out of business? And you then have a question over the ownership of the griefbot. Is there any ownership, right? Do you gain an ownership right over time just as a matter of equity? Or can the griefbot be extinguished entirely? These are really unknown issues, and I'll tell you right now that the episodic television show, which I watched all of, called "Upload" does not answer all of these questions.

Anna Gressel: Yeah and, well, that is true. "Upload" is great, and certainly not a show for lawyers on all the legal issue spotting involved. But it does raise really interesting questions about, are these subscriptions that could be terminated? That could be really complicated and raise issues that are emotional but also family legal squabbles. And there are also maybe issues about how griefbots might or might not be inherited, for example. You can think about these potentially passing down between generations. So there are these very, very interesting questions about how to treat digital property and digital media. Some of this played out with the original digitalization of music libraries, for example. Some of those questions were raised then, and now they're playing out in a different context here.

Katherine Forrest: Yeah, but they didn't talk back to you at that point, right?

Anna Gressel: True, very true.

Katherine Forrest: Well, you know, there's also another ethical issue that I want to raise, which is, does the deceased or the individual to be deceased... I mean, we're all, as we say in life, you know, the only things that are really known are that there's birth, death and taxes. So we're all going to be deceased at some point in time. Well, until there's some sort of fix to that. But anyway, here, let me get back to my ethical issue. Does the deceased or soon-to-be-deceased person get any say in whether or not they are captured in any kind of digital afterlife? You know, one could imagine a provision in a will that would say, yes, feel free to make a griefbot, or, by the way, here's my bequest for you to make a griefbot. Or one that would say, absolutely do not make a griefbot. And I suppose, if you were really going to actualize that last one, what you'd do is you would tie the revocation of some bequest to that. And I'm not giving any advice, I'm not giving any legal advice. I'm just thinking through these issues as you and I talk about them right now here as we go.

Anna Gressel: Yeah, but it's, I mean, the way that you phrased that, it raises a really interesting temporal question, which is, some of these things are done in someone's lifetime, like some of the ways that you might record someone or collect information, and some can be done after their lifetime. And the way you control that as a person is different. Your levers for controlling what happens when you're alive and after you're deceased are very different. But this is just completely fascinating. You can imagine all kinds of trusts that deal with likeness issues and IP rights of people having to now deal with this kind of a question in the same way that they might deal with the question of whether a biography could be created of someone, and who would control that process.

Katherine Forrest: Right, these are fascinating issues. And in subsequent episodes maybe we'll explore some of the products and how they're actually doing right now on the market. But for now, that's all we've got time for, Anna. I'm Katherine Forrest.

Anna Gressel: And I'm Anna Gressel. Make sure to like and share the podcast if you're enjoying it.

© 2025 Paul, Weiss, Rifkind, Wharton & Garrison LLP
