Two similar stories today showcase how we are being sold, manipulated, and possibly put in harm’s way by data brokers.
First, someone is selling the location data of people who visited Planned Parenthood clinics in order to run anti-choice ads:
The company’s data can be used to target ads to people who have been to specific locations — including reproductive health clinic locations, according to Recrue Media co-founder Steven Bogue, who told Wyden’s staff his firm used the company’s data for a national anti-abortion ad blitz between 2019 and 2022.
No one meaningfully consented to this kind of use of their data. I promise you, no one said “sure, sell one of my most personal pieces of information to complete strangers who can then use that data to harass me with ads attacking my life decisions and healthcare needs.” Pretty sure nothing like that was in any of the disclosure boilerplate that passes for consent in this country.
Second, “romance” AIs are keeping and selling personal information:
Mozilla dug into 11 different AI romance chatbots, including popular apps such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Every single one earned the Privacy Not Included label, putting these chatbots among the worst categories of products Mozilla has ever reviewed. The apps mentioned in this story didn’t immediately respond to requests for comment.
You’ve heard stories about data problems before, but according to Mozilla, AI girlfriends violate your privacy in “disturbing new ways.” For example, CrushOn.AI collects details including information about sexual health, use of medication, and gender-affirming care. 90% of the apps may sell or share user data for targeted ads and other purposes, and more than half won’t let you delete the data they collect. Security was also a problem. Only one app, Genesia AI Friend & Partner, met Mozilla’s minimum security standards.
Okay, yes: haha, people think a chatbot is their girlfriend. Except that these companies advertise themselves as a way to get past loneliness and improve mental health. And then they take deeply personal health information and sell it to whomever asks. It is a massive betrayal of people, some of whom may already be fragile. Even assuming they don’t get hacked — which, according to the security experts who contributed to the report, they probably will — the betrayal of trust is immense. And, again, I would bet all I own that the consent forms do not disclose just exactly what is taken from users and exactly what is done with what is taken.
It should not be like this. Your phone should not be allowed to store your location data. Companies should not be allowed to take whatever they want about your personal lives behind flimsy, poorly written and understood “consent” forms and use that information against you. People cannot be people without privacy — it is a central facet of humans’ makeup. Letting ourselves be used and abused in this fashion so that some ghouls can make a few extra dollars selling our medical, emotional, and most private information is insane.
If you were to try to start these businesses now by telling people that their every movement would be tracked and sold, and every detail of their use of medications or visits to therapists would be available on the cheap to anyone who ponied up, no one would agree to those conditions. Especially since all we get out of the deal are “personalized” ads. All that keeps these companies alive is the fact that they already exist, their ability to lobby, and the relatively unknown harms they perpetrate every day. None of those are good reasons to keep allowing businesses to abuse our privacy. These practices, and the aspects of the tools that enable them, should be made illegal.