As Internet companies like YouTube and Facebook struggle with the deluge of far-right extremism, racial bigotry, and conspiracy theories that have filled their platforms, it’s becoming increasingly clear that there’s one very simple and yet insurmountable reason they haven’t been able to get it under control: their revenue streams are built around attracting such content.
As an incisive piece by Mark Bergen at Bloomberg News laid bare this week, YouTube executives have been lackadaisical about the problem for years as it accumulated. “Scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread,” Bergen reported. “Each time they got the same basic response: Don’t rock the boat.”
The problem is similar at Facebook, except that there appears to be an obtuse corporate cultural issue making matters worse. After it announced last week that it planned to ban all white-nationalist and white-supremacist content, a simple test of Facebook’s system this week by a Huffington Post reporter revealed it all to be an utter sham.
Reporter Andy Campbell showed a Facebook spokesperson a video and other content by Canadian white nationalist Faith Goldy, and was informed that it didn’t violate the new Facebook standards—even though it was an outrageously straightforward piece of white-nationalist propaganda, rife with racial bigotry and anti-Semitism.
The video, titled “Race Against Time,” is a classic racist screed in which Goldy rails for five minutes “against people of color and Jews―especially those immigrating to predominantly white countries―who she says are ‘replacing’ white populations in Europe, the United States and Canada.”
As Campbell notes, such complaints about “replacement” are part of a broader white-nationalist propaganda campaign against multiculturalism, which they identify as a form of “white genocide.” Marchers at Charlottesville chanted “You will not replace us!” and the Christchurch terrorist penned a hateful screed he titled “The Great Replacement.”
Getting these platforms to clamp down on speech that helps fuel racial violence, notably conspiracy-theorist content that scapegoats targeted minorities, is made more difficult both by the traffic-boosting incentives to let such content continue and by the ease with which banned offenders can evade enforcement and keep posting.
No one is more emblematic of that problem than conspiracy-meister Alex Jones of Infowars, who was officially banned from YouTube and Facebook last August. Even though Jones had a long and horrific track record with his videos, the lawsuit filed by the parents of Sandy Hook victims plagued by Infowars followers made clear the potential liability that every platform that hosted his work faced.
Jones has not been easy to make disappear, however. His Infowars content has been reposted by a number of mirror sites that were eventually removed—one as recently as just after the attacks in Christchurch. In spite of this, Media Matters notes: “Channels that violate YouTube’s rules by exclusively sharing Infowars content are easily found on YouTube, but the video platform doesn’t appear to be devoting many resources to enforcing its own rules.”
Indeed, as Bergen reported, YouTube very nearly installed a remuneration system for its video creators in 2017 that would have made Jones the site’s highest-paid contributor.
The top priority at YouTube, as the Bloomberg story explains, is “engagement”: getting people to come to the site and stay there, measured as views, time spent watching, and interactions. Moderating extremist content is deprioritized whenever it interferes with those goals.
The company announced early in 2019 that it intended to crack down on the conspiracism. But part of its problem is that YouTube itself created a huge market for these crackpot and often harmful theories, fueling an unprecedented boom in conspiracism. And that same market is now where it makes its living.
The formula for success that emerged over time at YouTube is simple: “Outrage equals attention.” Brittan Heller, a fellow at Harvard University’s Carr Center, observed that it’s also ripe for exploitation by political extremists and hucksters. “They don’t know how the algorithm works,” she said. “But they do know that the more outrageous the content is, the more views.”
And the more views, the more money these platforms will roll in. Hate and division become the fuels for profit in this system. It’s a recipe for cultural disaster.