It’s a new year, and no matter how many resolutions I make, they won’t change some of the problems and irritations from years past on the internet — namely trolls, keyboard racists, and misogynists. They are a very special breed who haunt the comment sections of blogs and boards, of news sites and book reviews, spewing vile and often violent screeds that tend to remain unmoderated, their words blazing beacons of bigotry.
The phrase “suffer fools gladly” is attributed to Saint Paul, and to “not suffer fools,” according to the Cambridge Dictionary, is “to have very little patience with people who you think are stupid or have stupid ideas.” The trolls on the internet go beyond simply stupid: They have slimed over into dangerous and harmful territory. Yes, they are ubiquitous, and we have all been told repeatedly, since the early days of the net, to “avoid feeding the trolls.” As a veteran of early bulletin boards and Usenet, I’ve lived through their birth and proliferation. That history can be found in Ashley Feinberg’s piece, “The Birth of the Internet Troll,” in which she concludes:
Clearly, for as long as the internet has been around, trolls have existed in some form—whether they were called that or not. There will always be agitators. There will always be people who want to upset others. That's not going to change.
What we can change, though, is how we approach these situations in all their varied forms. Which, according to Phillips, "depends on whose voices platform administrators, advertisers, and other people on the business end choose to privilege—the targets of abusive, intimidating behaviors or those who are doing the intimidating." It's not an issue of "feeding the trolls" (a problematic phrase in its own right), but rather whether or not we're going to stop giving a platform to the trolls, the aggressors, and the antagonizers. Whether it be by not validating their behavior with concessions or dropping the catch-all term "troll" in favor of more accurate terminology—be it misogynist, sociopath, or straight-up dick.
So yes, assholes have and will always be around, as will their unfortunate victims. It's just a matter of who we let hold the megaphone.
This isn’t the first time I’ve addressed this issue. In “Fighting racism … one keystroke at a time,” I talked of the need to push back, and not simply ignore the comments, closing with, “A few keystrokes a day can drive racism away.” There is a need to expand on individual efforts and address the platforms that have given hate a license. This becomes even more imperative during an election year in which the Republican contenders are falling all over themselves to emulate their bigoted frontrunner in upping the ante on inflammatory rhetoric.
In Feinberg's piece, she links to Whitney Phillips’ “Don't feed the trolls? It's not that simple,” part of Phillips’ series on “The Anti-social Web.” In “Comment moderation and the (anti-) social Web,” Phillips takes on the oft-repeated defensive battle cry of “free speech” when hateful spew is confronted. After citing the actual text of the First Amendment, she points out that comment moderation on privately held websites is not a violation of “free speech,” and goes on to confront the meme:
But it’s not the legal sense of the term that’s the problem; it’s the colloquial sense of the term—the idea that people should be able to say whatever they want on the Internet, even if, maybe even especially when, what they have to say is antagonistic or otherwise obnoxious, because … free speech (this argument is a textbook case of circular logic; for an example, see Storify’s CEO Xavier Damman’s reaction to several female users who had complained about on-site harassment).
Concurrent to this basic assertion—that people should be able to say whatever they want on the Internet, because people should be able to say whatever they want on the Internet—is the assumption that it’s everyone else’s job to sit back and deal with it, a line of reasoning that typically goes something like, well, if the things people say offend you so much, then don’t read the comments section, or don’t come online at all. And if you can’t handle a little heat, then don’t bother commenting. It’s a person’s God-given right to fling poo all over the comments section; this is America. Also, welcome to the Internet.
The kind of speech most likely to be defended by this line of reasoning is speech that is bigoted and antagonistic, largely toward women and other historically underrepresented groups (note the infrequency with which women and people of color use the “...but but FREE SPEECH” defense in a debate, whether online or off-). Free speech in the colloquial Internet sense, particularly as it’s used in the context of comment moderation, almost always justifies or outright apologizes for a typically male, typically white aggressor. It is a concept that frames freedom in terms of being free to harass others, not freedom from being harassed, or simply from being exposed to harassment (which often amounts to the same thing).
Before anyone reading gets bent out of shape about her characterization of vile speech-spewers as white males of a certain type, I will state unequivocally that here at Daily Kos, there are many white males who regularly “flag” or “donut” vile speech. I agree with her pointing out that women, people of color, and LGBT internet users (and I will add Muslims and Jews to the list) are most often the targets of bigotry and hate. That translates into real-world, meat-space data on who is actually physically harmed by haters. One need only add up the stats on domestic abuse, police abuse, rape, church burnings, and mosque and synagogue attacks to see that the spew online reflects a fact of life in the “real world.” Remember that the people sitting at a computer typing shit are actual live humans who may live next door to you, or work at the next desk.
Here are two comments I just read on an article about a recent news event (warning: hate speech). I’m not linking to them.
Lets all bow our heads in a moment of silence an pray that President Trump will add niggers to the list when he is deporting the mexicans.
The moran went on to post:
It should be legal to kill uppity niggers they are inbreeding intirely to much the past 50 years, women like this and men with pants hanging down to their knees have become the norm for our society. If any of you nigger slut whores had any idea who your own father was it would help you to not grow up and repeatedly get impregnated by him! The only hope left for America is the mandatory spaying of all buck negro at birth.
Can’t tell you how much of this vomitous stuff I encounter in a day’s reading. Lest you think this is simply a U.S. problem, our neighbor to the north, Canada, has an internet plague of racism targeting First Nations peoples:
When indigenous writer and teacher Chelsea Vowel reads the comments on her articles, she feels physically ill. “I have had people threaten to find out where I live,” the Montreal resident told the Guardian in an email. “I’ve seen people call for sterilization of indigenous people, suggest that people blocking roads in protest should be shot or run over and say that I should be raped and murdered.”
The 38-year-old mother and stepmother of five now publishes fewer articles that debunk myths about indigenous people because of the racist backlash: “It drains me, and makes me hurt.”
Online discussions of aboriginal issues in Canada can become so vitriolic that the Canadian Broadcasting Corporation (CBC) decided earlier this week to temporarily close comments on stories about indigenous people. Brodie Fenlon, acting director of digital news, said that while many topics incite problematic discussion, the number of comments that descend into hate speech and personal attacks are disproportionately higher on stories related to indigenous issues. The public broadcaster will review its moderation process and plans to reopen comments in mid-January. (Full disclosure: I edited Vowel’s blogs and worked with Fenlon at the Huffington Post Canada.)
In “White Trolls in Canada: Don’t Read the Comments,” Eternity Martis wrote:
With recent discussions about how trolls are ruining the internet for the rest of us, debates on exposing the identities of trolls, and how their comments teeter between hate speech and free speech, online media is starting to make influential changes.
U.S. websites like Mic, Reuters, and Recode have eliminated their comment sections completely in the hopes of winning an ongoing struggle with moderating comments. Broadly, the new female-focused channel from Vice, launched without a comment section. The Toronto Star and the Globe and Mail have phased into closed comments on sensitive articles after years of begging for “respectful” and “civil” online conversations with tremendous fail. On September 9, 2015, the National Post delivered a blow to online trolls by requiring them to sign into Facebook before posting, hoping the lack of anonymity will deter hate speech.
In “Policing The Trolls: The Ins and Outs of Comment Moderation,” National Public Radio (NPR) discusses its policies here in the U.S.
The majority of NPR comment sections are not moderated by NPR staff. The exception to this is Code Switch. This blog is heavily curated in-house in order to keep discussion civil on the often sensitive subject of race. For guidelines on how the blog is moderated, check out this post by Gene Demby on "The Four Types of Comments We Usually Remove On Code Switch."
Social media platforms are getting pushback. Yik Yak, described as “a social media smart phone application,” has come under fire.
One of the biggest criticisms of social media sites and applications is their inherent potential to feed the growing amount of cyberbullying. Due to the widespread bullying and harassment committed through Yik Yak, many schools and school districts have taken action to ban the app. These include several Chicago school districts, Norwich University in Vermont, Eanes Independent School District in Texas, Lincoln High School district in Rhode Island, New Richmond School District in Ohio, Shawnigan Lake School in Canada and Pueblo County School District in Colorado. Tatum High School in New Mexico banned cell phone use from the school due to Yik Yak, and the Student Government Association at Emory University in Georgia attempted to ban the app across campus, but failed to do so after immense backlash from students.
On May 13, 2015, Santa Clara University President Father Engh released a statement to all students after several racist remarks were posted on Yik Yak. He writes, “Hate speech, not to be confused with free speech, has no place at Santa Clara University, because it violates the dignity and respect with which each member of our community is entitled to be treated. Hurtful comments directed at individuals or groups diminish us all and create a divisive atmosphere of distrust and suspicion.” This highlights the ethical controversy of cyberbullying and racism within the social media app.
At Lewis & Clark College, students protested threats posted on Yik Yak:
After an anonymous person posted threatening racist comments on Yik Yak Tuesday, students at Lewis & Clark College in Portland, Ore., rallied on campus grounds Wednesday morning, both in solidarity with black students who feel unsafe, and to demand the administration take action.
About 200 students huddled on a main thoroughfare between the library and a key academic building, forcing people to step over them to get to class.
The informal group of students who organized the rally encouraged others to skip class and also led a discussion about racism on campus.
The Yik Yak commenter wrote: “black people think just cause their ancestors are slaves they deserve more…Well in that case, let’s give them something to whine about #Bringbackslavery,” and, “I just want to hang you ignorant black people.”
Back in November 2015, a suspect was arrested for allegedly posting racist death threats on Yik Yak against black Americans on the University of Missouri campus.
The Washington Post reported protests at American University in D.C.:
Some students at American University got so fed up with the racist comments they were reading on social media that they decided to spread them.
They launched an online campaign, #TheRealAU, to blast out the racism they see, in hopes it will make it more difficult to ignore. They have been posting and sharing screenshots of slurs. They plastered them on the school’s front gates. And they are demanding that the administration do something.
This was in October 2015, but students have been speaking out for months. This news report is from May 2015:
Recent events at the University of Oklahoma, the University of Maryland, and the University of Virginia have started widespread protests about race.
American University students are speaking out about recent racist comments made on the social media app Yik Yak and calling for action on behalf of the school community.
Women are a major target of online trolling, bullying, and death threats. In “This Is What Happens When You Report Online Harassment to the Police,” Julie Zellinger points out the obstacles to getting any satisfaction from police agencies. She also addresses the ways in which the constant barrage of hate and real-world stalking drives women off the internet.
More than trolling: What happens on the Internet undeniably affects the "real" world, especially in that harassment disproportionately disadvantages and silences some populations — namely women, LGBT individuals and people of color — more than others.
"We have watched people slowly leave the space of trying to make change in the world because of online harassment," May said. It's a point that others have echoed, including journalist Michelle Goldberg, who earlier this year explored the phenomenon of feminist writers being driven off the Internet altogether by harassment for the Washington Post.
This is why, May said, it's so important to "come up with some ways to address this that's more than just telling people to 'not feed the trolls' or 'get a thicker skin.'"
She also links to Twitter CEO Dick Costolo’s admission that the company has failed to deal well with the issue.
Twitter CEO Dick Costolo is taking personal responsibility for his platform's chronic problems with harassment and abuse, telling employees that he is embarrassed for the company's failures and would soon be taking stronger action to eliminate trolls. He said problems with trolls are driving away the company's users. "We suck at dealing with abuse and trolls on the platform and we've sucked at it for years," Costolo wrote in an internal memo obtained by The Verge. "It's no secret and the rest of the world talks about it every day. We lose core user after core user by not addressing simple trolling issues that they face every day."
Whether it’s bullying, bigotry, or hate speech online, the problem is a global one. The Council of Europe launched a “No Hate Speech Movement” in 2013.
Canada has the Stop Racism and Hate Collective.
There are many groups and websites where people are speaking out and fighting back. Make a resolution for 2016 to help stop the hate—online and off.
Ignoring it won’t make it go away.