Everything has a cost, even the most revolutionary, positively transformational advances. Socrates worried that the invention of writing would ruin people’s ability to memorize things, because writing would create “external” memories that reduced the need to memorize anything. Consider that, at the time, Homer’s Iliad and Odyssey, about 800 pages in total, were not written down; they were composed mentally, communally, and sung and passed down by word of mouth. When Google emerged, researchers worried that it was making people dumber. The term “Google effect” has come to refer to the way people easily forget things because information is so readily available. These two technological advances were not just important; it is hard to argue that they were bad for society. They allowed more knowledge to be produced and made that knowledge far more accessible. They did, however, have a price.
Uniformity of Thinking
The social psychologist Jonathan Haidt has been researching platforms such as X (formerly Twitter) and Facebook for nearly a decade. His conclusions about the costs of social media and its impact on democracy are not optimistic. During a conversation with Bill Whitaker on 60 Minutes, he remarked, "I would certainly say that the platforms are an existential threat to American democracy. It's not a good versus evil story. It's a question of changed ecosystems in which democracy can't grow and thrive." Social media has connected the world; that is undeniable. Yet Haidt’s research shows that increased use of social media is linked to a decline in the functioning of democracy. The reason is simple: platforms are fighting for your attention and competing for eyeballs. Engagement is their business, not democracy promotion. That’s not a criticism: your local bakery doesn’t exist to promote democracy either. What makes social media harmful is that this engagement is won by sharing divisive content. Think about it: dunking by reposting content you hate and getting other people to say how bad it is, talking about how evil the other guys are, and so on, are ridiculously popular engagement tactics. Quick question: how many people emerged out of poverty last year across the world? You don’t know, because that kind of story doesn’t win engagement. Human beings lived in hostile environments for most of the last 300,000 years, so it made sense to be very sensitive to threats; we have evolved to be more sensitive to threats than to good news. A 2021 report found that a post is 67% more likely to gain engagement if it attacks the political opposition. Anger, that most primal of emotions, is the emotion that defines social media.
Politicians are similarly affected: research shows a rise in angry posts by members of Congress, who send 100,000 posts a month to their audience of 250 million Americans, and who are rewarded with tens of millions of reactions every month. Anger pays. But democracy is about compromise: talking to the other side, and graciously accepting it when you lose the argument.
AI may make this worse. Although ChatGPT is now seen as THE AI product, TikTok was perhaps the most successful AI product before it. TikTok is not really a social media platform: the content you see on TikTok is algorithmically selected from all the content on the entire platform. Until recently, Instagram, which is a social media platform, chronologically sorted content from your own social network. Under competition from TikTok, Instagram has shifted its model to look more like TikTok’s: most of your feed will increasingly come from what an algorithm believes is the best content on Instagram. This has already had real-world effects. Coffee shop designs across the world look more and more alike, because designers are all getting the same recommendations from Instagram. Researchers have been able to detect when a scientific article was written by ChatGPT, because ChatGPT tends to favor words like “delve”, “explore”, “tapestry”, “testament” and “leverage”. That will shape how scientists write. Homogeneity is a consequence of algorithms governing our lives: similar playlists, buildings that look alike, and, I believe, political opinions that are increasingly uniform.
We are entering an election cycle in which it has never been easier to get everyone to think the same thing. Given how political camps sit in separate information funnels, we could witness the first election in which AI gets vast swathes of people in either camp to be outraged by a specific set of things and to make arguments using the same talking points. This could be the equivalent of introducing the nuclear bomb to warfare. Social media has already corroded discourse; imagine what generative AI on social media can do.