Facebook is failing the public—and the underlying principles of an open society, including the free exchange of ideas—on a multitude of fronts. The social-media giant has become the home, organizationally and (dis)informationally, of a broad menu of far-right extremists and their endless supply of frequently absurd conspiracism: QAnon, white supremacists, “Boogaloo Bois,” you name it.
The company routinely denies that this is the case, and insists it is taking steps to remove these cancerous fonts of false information—and yet, mysteriously, they not only continue to spread their toxic garbage but are growing. The example of the “Boogaloo” phenomenon—the civil-war-hungry gun aficionados whose violent rhetoric has already fueled various acts of domestic terrorism—and its presence on Facebook, as a recent report from the Tech Transparency Project explores in detail, suggests that Facebook’s inaction is making a worrisome situation worse on a daily basis.
The report described Facebook’s response to the “Boogaloo” cult as “slow and ineffective,” allowing “the movement to persist on its platform.”
“Facebook has repeatedly failed to remove ‘boogaloo’ extremists who are using the platform to plan for a militant uprising—an alarming illustration of the company’s broader problems dealing with dangerous content,” the report says in its opening.
The company, the report found, “has consistently failed to spot boogaloo activity and missed boogaloo groups’ simple name changes designed to evade detection.” These problems—even in the face of political and media pressure to act—point to “deeper dysfunction policing its platform for things like hate speech and misinformation.”
The report made five key findings:
- TTP identified 110 Facebook boogaloo groups that were created since June 30, when Facebook announced it was banning a “violent” boogaloo network. At least 18 of the groups were created on the same day as Facebook’s announcement.
- Many boogaloo groups have easily evaded Facebook’s crackdown by rebranding themselves, often co-opting the names of children’s movies, news organizations and even Mark Zuckerberg. Some of the newer groups already have more than 1,000 members.
- Material on bomb-making and other violent activity is continuing to circulate across boogaloo groups on Facebook. The content includes a Google Drive folder that contains dozens of instruction manuals for bomb making, kidnapping, and murder.
- Despite Facebook’s promise to stop recommending boogaloo groups to users, the company’s algorithms continue to suggest boogaloo-related groups and pages, even when they don’t use the word “boogaloo” in their names.
- Facebook has sought to justify its selective approach to removing boogaloo groups by arguing that some parts of the movement are not violent—even though the term “boogaloo” is synonymous with civil war.
As Tess Owen explores at Vice, these are not ordinary exchanges in a marketplace of free ideas, as the libertarians at Facebook seem to think. They involve both violent rhetoric aimed at a wide range of would-be targets and open discussions of organizing violent attacks on behalf of the “Boogaloo.”
In Facebook groups with thousands of members, the would-be insurrectionists are exchanging and distributing a wide variety of documents aimed at organizing for the coming civil war: an “Al Qaeda kidnapping manual,” the “Army Sniper Manual,” reports on such bombings as the 2005 Islamist terror attack on London’s subways and buses, which killed 56.
Owen observes: “The distribution of entire folders containing instructions for violent acts is an escalation even compared to just a few months ago, when members were just pasting recipes for Molotov cocktails directly into Facebook groups.”
The volatility of the phenomenon is underscored by the way armed far-right “Patriots” were gulled into believing that hordes of black-clad leftists in George Soros-financed “antifa buses” were about to descend on scores of rural towns across the nation, provoking a wave of threatening and disturbing reactions—all because they had read about it on Facebook. In several cases, the hoaxes were spread by police, city officials, and other persons in positions of authority.
Part of what has confounded Facebook’s attempts at cracking down on the phenomenon has been the movement’s nimble evasion of the platform’s rules—an agility evident in its early adoption of code words such as “Big Igloo” and “Big Luau” to slip past detection. As Facebook attempted to apply stricter algorithms to catch even these evasions, the movement quickly adapted.
Once moderators caught on to those codes, the groups adopted newer slang, such as “Alphabet Bois” (a reference to federal agencies like DHS, ATF, FBI, CIA), or just “[redacted]”. After Facebook’s action against the Boogaloos in June (it banned hundreds of Boogaloo groups and users), adherents reconvened on MeWe, an obscure app. There, they devised a plan to create new Facebook groups with new language—including references to CNN and VICE News.
“It’s a whole new theme for their movement,” TTP’s Kristin Paul told Owen. “They use words like ‘cameras and film’ instead of ‘guns and ammo’.”
Facebook’s seeming inability to contain the spread of far-right extremism is becoming a global issue. Its own internal investigation found that the growth of the far-right QAnon meta-conspiracy theory among its users was not only immense—there are now thousands of QAnon groups on Facebook, with millions of members—but also spreading beyond the United States to Europe and Australia.
Similarly, an earlier TTP report found that white supremacists were able to worm their way around its rules to maintain a toxic presence: 113 of the 221 groups designated as hate groups by the Southern Poverty Law Center and the Anti-Defamation League had a foothold on the platform. The platform’s algorithms, it said, worsened the problem by referring users to other white-supremacist pages, amplifying the ideology along the way.
Facebook has responded with halting and inconsistently applied enforcement of its terms of service, which explicitly forbid threatening, violent, or hateful speech. Its initial response to the deluge of QAnon conspiracism on its platform was to ban a handful of pages and users who indulged in spreading the cult’s bizarre claims—not for spreading false smears, but for “coordinated inauthentic behavior,” i.e., because the people operating them broke Facebook’s rules against false or double identities.
The spread of white nationalist and other far-right ideologies has been a challenge for virtually every Internet-based company, including Google, YouTube, and Twitter. The greatest hurdle has been getting the companies to recognize that reporting systems alone cannot slow this spread, and that the algorithms that have made them profitable also help create the closed feedback loops that accelerate radicalization and amplify far-right ideologues’ ability to spread propaganda.
The only solution that works entails employing a workforce large enough to handle the human effort of weeding out the toxic material, and one trained well enough in the nuances of the ideologies and their actors both to recognize them and to act accordingly. Several Internet platforms, including Twitter and Discord, have made substantial progress in dealing with the problem this way.
Facebook, however, has not. And while the company may not face consequences in the United States for this failure—CEO Mark Zuckerberg, after all, has held private sessions with Donald Trump, and has plainly allied his company with the increasingly radical mainstream right—it will be a very different story for the company overseas, especially in nations that have joined New Zealand’s global campaign to reform social media to make it less hospitable to hatefulness and violence.
TTP’s Paul believes the company is fully capable of making the change—the question is whether it actually wants to.
“Facebook has a counterterrorism team of 350 people,” she told Owen. “These groups are not hard to find if you’ve been following the movement and understand what language they’re using. All you’d need is at least one dedicated person to make sure this movement stays deplatformed.”