NY Magazine has a heartbreaking story about a woman, her son, and her sister who died after the woman became convinced that the world was about to end. To protect her family, she took them “off the grid” based on an incomplete understanding of what that entailed. They died, starving and frozen, that winter.
The focus of the story is how a woman decided to put her family in that kind of danger, the implicit understanding being that it is primarily men who make these kinds of rash, power-hungry, and/or paranoid decisions. But I keep coming back to how the internet seems, today, almost designed to kill people like this woman and her family. Yes, she and her sister obviously had mental health issues. And yes, the boy’s father needed to intervene. But it is clear from the article that she was convinced by things she read on the internet that she could go off the grid, and that she needed to in order to protect her son from the coming apocalypse.
The story does not talk much about social media, outside of the woman banning her son from it. But I cannot help but wonder how she found the blogs and articles and YouTube videos that radicalized her. We know that Section 230 has been weaponized by content and social media companies to shield them from any consequences of what their algorithms do to engage people. We know that companies like Facebook deliberately tune their algorithms to drive time on the site, regardless of the harm. We know that the YouTube recommendation algorithm can turn radical very quickly. And we know that people suffering from mental illness and isolation can overuse these products.
And yet we do nothing.
We keep pretending that algorithms are speech, not products. We keep insisting that it will be the end of speech online if the companies that manipulate what people see are held to account for the results of those manipulations. We keep yelling that it is fine to create systems that exploit base human emotions to keep people enraged, and thus engaged, with social media products. None of the above is true, but it doesn’t matter, because the very idea that these companies should be one penny poorer, or that anyone should ever be held to account for their actions, is anathema to people who worship the internet and the private marketplace.
We are a society. We are supposed to take care of each other. There is a stretch of road near my house where the speed limit changes rapidly from 45 to 25 and then back again to 45 within a few hundred feet. It is set up like this because that stretch of road passes by an elementary school. Efficient driving gives way to protecting children and those responsible for them — as it should. And as it should on the internet.
Nothing will ever save every person like this woman. But it is possible that by holding companies responsible for the results of their algorithms, by refusing to put profit over lives, by shaming people who apply their skills to harming others, we could save some of them. And at the cost of nothing important — a few companies become merely obscenely wealthy instead of murderously wealthy. No one’s speech would be curtailed — having algorithms that don’t encourage people to go down rabbit holes, or to get angry in order to stay on a specific webpage, will not keep anyone from publishing a damn thing. But it would likely save some people from ruin — like a society should.