Sometimes I wonder if we really deserve to survive as a species. Too many of us delight in shaming, embarrassing, and even driving other human beings to suicide, people who have done nothing except exist. Whether they are the wrong religion, or the wrong skin color, or think the wrong thoughts, or like the wrong people, far too many of us take a sadistic delight in making other people miserable for no reason other than that they can. And others, while “deploring” the situation, do nothing, or even abet it because it makes them money.
I could be describing Trump. And he does fit this description. But I have a different target in mind this time, one that Nicholas Kristof writes about in the NYT: The Online Degradation of Women and Girls That We Meet With a Shrug.
Alarms are blaring about artificial intelligence deepfakes that manipulate voters, like the robocall sounding like President Biden that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.
Yet there’s actually a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98 percent of deepfake videos online were pornographic and that 99 percent of those targeted were women or girls.
This is how we use the technology we have spent so much time and energy developing: creating pornographic images of people, almost entirely women and girls, who did not consent and who had no idea this was happening to them until it was far, far too late.
But it’s not just those who create these monstrosities. It’s also those who carelessly or even knowingly profit off them:
Companies make money by selling advertising and premium subscriptions for websites hosting fake sex videos of famous female actresses, singers, influencers, princesses and politicians. Google directs traffic to these graphic videos, and victims have little recourse.
And not just the famous:
Sometimes the victims are underage girls.
Kristof goes on:
Sophie Compton, a documentary maker, made a film on the topic, “Another Body,” and was so appalled that she started a campaign and website, MyImageMyChoice.org, to push for change.
“It’s become a kind of crazy industry, completely based on the violation of consent,” Compton said.
The impunity reflects a blasé attitude toward the humiliation of victims. One survey found that 74 percent of deepfake pornography users reported not feeling guilty about watching the videos. [emphasis added]
I pasted this specifically to include the link to the MyImageMyChoice website, which announces:
Intimate image abuse can have devastating, even life-threatening, impacts. But governments and tech platforms aren’t doing anything to address it. Intimate image abuse websites, based on violating consent, have become thriving online businesses. Companies like Google, Visa, Verizon are enabling and profiting off this abuse, and normalizing misogyny.
We are campaigning for governments and tech companies to #BlockMrDeepfakes and the 3000+ sites dedicated to online gendered abuse.
But also take note of the bolded passage above. A “blasé attitude toward the humiliation of victims.” This is the real problem. These sites would have no followers, and their enablers would make no money off them, if we as a society would train each generation to respect one another, to grant everyone the dignity to which every human being is entitled. We were making some progress in this direction. Marital rape is now a crime (40 years ago, it wasn’t necessarily so). We are gaining a better understanding of the reality of sexual variation. Women are still behind in the workforce but are catching up.
All that progress and more is now in danger, because of White Male Supremacy, religious intolerance of women’s equality, the dismissal of empathy as a sign of weakness. Add others to the list.
And the other thing is, the search engines that drive this latest insanity could fix it if they wanted to.
As I see it, Google and other search engines are recklessly directing traffic to porn sites with nonconsensual deepfakes. Google is essential to the business model of these malicious companies. . . Google is socially responsible when it wants to be, but it seems indifferent to women and girls being violated by pornographers.
Indifferent. Almost as bad — no, just as bad — as making these images in the first place. Kristof quotes some Google and Microsoft executives who say they are working on the problem, but as he says, “Count me unimpressed.”
Kristof suggests some possible ways to help:
I’m in favor of trying to crack down on deepfakes with criminal law, but it’s easy to pass a law and difficult to enforce it. A more effective tool might be simpler: civil liability for damages these deepfakes cause. Tech companies are now largely excused from liability under Section 230 of the Communications Decency Act, but if this were amended and companies knew that they faced lawsuits and had to pay damages, their incentives would change and they would police themselves. And the business model of some deepfake companies would collapse.
But that’s only part of what is needed.
The greatest obstacles to regulating deepfakes, I’ve come to believe, aren’t technical or legal — although those are real — but simply our collective complacency. . .
It’s time for similar accountability in the digital space. New technologies are arriving, yes, but we needn’t bow to them. It astonishes me that society apparently believes that women and girls must accept being tormented by demeaning imagery.
I am not astonished. Disappointed, depressed, dismayed. But not astonished. This is what always happens when one part of our species refuses to accept all other parts as equal, as human as everyone else, as entitled to dignity. And the more so when it is driven by religion, by politics, by greed.