Before I begin, a word to my fellow tech nerds. Guys. Guys. I'm trying here, I really am. I am trying to remind people that not everyone associated with technology, even in Silicon Valley, is a lunatic ghoul. But you've got to give me something to work with here. Getting rid of grief via chatbots??? I mean, what next? Robots to take the joy out of walking your dog? A little help is all I am asking. Could we try not to live up to all the stereotypes, please?
Because that is what this story is about: a tech company that thinks it is a good idea to recreate a dead person's personality so that the people left behind won't miss them. Now, put aside that you cannot recreate a personality in a chatbot, not really. People are not the sum of their written correspondence, and things like hallucinations and the rigidity of the models will quickly make it clear that you are not speaking to the actual person you miss. The real issue is that getting rid of emotions, even unpleasant emotions, is a terrible idea.
Grief is part of what makes us human. I have buried too many people, including people who were far too young to have their lives taken from them. To this day, I miss those people, even the ones who were buried twenty or thirty years ago. And that is okay. It is not pleasant (and for those of you dealing with recent grief, I can offer that, eventually, the memories are more pleasant than painful), but it is a part of being a fully formed human being.
Grief reminds us that lives have value. Grief reminds us that people are worth fighting to save. Grief connects us to each other in a profound way. The pain of an absence encourages us to try and prevent those absences, not just for us but for everyone. Grief, in a very real way, is a spur to our better natures, a warning about what awaits if we give up on our fellow human beings. Grief propels us, in the best cases, into a deeper appreciation of the value of all people. Grief, in part, is what motivates the outpouring of help after natural disasters. Decent people won’t want others to experience the losses they have. Grief, in a very concrete way, makes us human.
All of our emotions do. Loss, regret, all of them teach us something about the value of other people, about the consequences of our choices on ourselves and those around us, about how to be a member of a society. The short-term relief is not worth the long-term loss. And, as I said, I have no doubt that this will be a glitchy simulacrum at best. But the very idea that you would want to keep people from experiencing real, human emotions betrays a deep, deep inability to understand your fellow human beings. No one needs a chatbot to take away their grief. We need people to help us work through that grief, yes, but not to eliminate it. Emotions are what make us human.
It would be nice if our tech overlords remembered that.