Back in August, Facebook announced it intended to ban far-right militant, anarchist, and QAnon groups from its platform. Two months later, TTP was still able to find 45 pages and eight groups associated with right-wing militias. “At least 53 Facebook militia pages and groups are still active on the platform. Some of them even have the word ‘militia’ in their name,” the report says.
The ads recruiting participants into various “Patriot” militias have run for at least the past couple of years, some of them reaching tens of thousands of users. They have continued running even after the so-called ban: “As recently as October, Facebook hosted an ad encouraging militias to attend a ‘freedom march’ in cities across the country just days before the election,” the report says.
The ad, which cost less than $100 and had the potential to reach between 500,000 and 1 million people, read: “We The People gather across America in a show of solidarity and demand emancipation from the bondage of tyranny. (Lawful carry & Militia strongly encouraged.)”
The ads in particular have been noteworthy:
As recently as mid-August, a Facebook ad for a group called New Mexico Light Foot issued a call for new members, expressing its allegiance to the 2nd Amendment with an image of a semi-automatic rifle. (“Light foot” refers to privately organized local militia battalions.) Another ad for America’s United Militia touted the group’s mission to “uphold and support the constitution,” adding, “Fight with us to take back America.” That ad had up to 45,000 “impressions,” indicating how many times it appeared on a screen.
The Facebook page for “Virginia Militia” ran a total of 61 ads. One of its last ads from February 2020 promoted a “Muster Call,” with the message, “Are you going to give up your rights or fight?” TTP also found an ad campaign for a Facebook page called “My Militia – American Patriot Community.” The page ran political ads in fall 2018 ahead of the midterm elections, with the message “The red wave is rising” and “It’s not revenge we are after, but a reckoning.” After the midterms, the group took out recruitment ads calling American militia men “the last hope of freedom” and urging users to “join your local militia today.”
These pages are far more than just basic organizing centers; they double as disinformation and propaganda outlets, as well as forums for extremist rhetoric that often blatantly violates Facebook’s terms of service but is rarely if ever removed. Numerous members of “Patriot” and pro-Trump Facebook pages have posted explicit threats to kill public officials and racial justice protesters, including Whitmer.
As Buzzfeed reported, Facebook—which was the organizing platform of choice for the Michigan militiamen who plotted to kidnap and kill Whitmer—responded quickly by telling journalists it had reached out to the FBI early in the investigation. Company spokespersons said the platform takes down content when it’s reported to law enforcement, so long as there is a “credible threat of imminent harm to people or public safety.”
Yet even though Facebook officials told reporters that the company had removed such Michigan militia groups as the Michigan Liberty Militia—two of whose members were arrested in the Whitmer plot—and the Michigan Militia Corps, TTP found a number of Michigan militia pages still active on Facebook that same week, including another page for the Michigan Liberty Militia, operating under the barely altered moniker “MLM Michigan Liberty minutemen.”
Facebook’s seeming inability to contain the spread of far-right extremism is becoming a global issue. Its own internal investigation found that the far-right QAnon meta-conspiracy theory’s growth among its users was not only immense—there are now thousands of QAnon groups on Facebook, with millions of members—but was also spreading beyond the United States to Europe and Australia.
Similarly, an earlier TTP report found that white supremacists were able to worm their way around Facebook’s rules to maintain a toxic presence on the platform, including 113 of the 221 groups designated as hate groups by the Southern Poverty Law Center and the Anti-Defamation League. The platform’s algorithms, the report said, worsened the problem by referring users to other white supremacist pages, amplifying the ideology along the way.
Facebook has responded with halting and inconsistently applied enforcement of its terms of service, which explicitly forbid threatening, violent, or hateful speech. Its initial response to the deluge of QAnon conspiracism on its platform was to ban a handful of pages and users who indulged in spreading the cult’s bizarre claims—not for spreading false smears, but for “coordinated inauthentic behavior,” i.e., because the people operating them broke Facebook’s rules about false or duplicate identities.
The spread of white nationalist and other far-right ideologies has been a challenge for virtually every Internet-based company, including Google, YouTube, and Twitter. The greatest hurdle has been getting the companies to recognize that reporting systems alone cannot slow this spread—and that the very algorithms that have made them profitable also help create the closed feedback loops that accelerate radicalization and amplify far-right ideologues’ ability to spread propaganda.