Once upon a time, we worried about the proliferation of “fake news” on Daily Kos, and instituted a moderation system that helped keep that tripe out of our site. It was a two-pronged system, depending on the able work of both you, the community, and a cadre of human moderators. It has worked splendidly.
But that system depended on the ability of everyone to easily identify suspect material. Given that we are a well-informed liberal site with a well-defined ideological identity, it made it particularly easy to spot the bad stuff.
But the world has evolved. The biggest danger in the information space is no longer fake news: people like Donald Trump or Fox News simply lying. Misinformation and disinformation are becoming increasingly sophisticated, and not just from the right. “Deep fakes” put all audio or video into question, as technology can be used to literally invent things that didn’t happen. As the recent fake Pentagon bombing story showed, it’s becoming harder and harder to spot false content.
We are no longer worried about malicious actors trying to harm the site and its community. We’ve gotten pretty good at spotting and eliminating that. The danger now is that well-intentioned people unintentionally spread false information. The issue is exacerbated by confirmation bias: the desire to believe that Donald Trump or any of our roster of villains said or did that particularly crazy thing.
For reasons that should be obvious, we cannot let Daily Kos become a vector for that kind of false content. Our job is to inform and present information responsibly, rooted firmly in the “reality-based community.” Anything that undermines that mission harms the site’s reputation, and your very own ability to be informed about what’s happening in the country.
As such, we are taking two major steps to safeguard the site and its community.
One of those changes is being more proactive in modifying community stories, removing or editing false content. In order to do so, we’ve tweaked our community’s foundational document, the Rules of the Road.
First, we bolstered rule #3 in the “do” section:
3. Strive to be accurate. Use trustworthy sources. Take a moment to check your work, evaluate the source(s), and verify the facts to the extent possible. And if you find out you made a mistake, own it and correct it. If commenters are skeptical, take their concerns seriously. Don’t be a part of spreading misinformation, disinformation, or conspiracy theories.
This means that you shouldn’t uncritically accept something you see floating around. If it’s too good to be true, or particularly salacious, look for corroborating information. Understand that technology can now make anyone say or do anything, passing it off as real. We have to approach the news with caution and skepticism, and maybe a healthy dose of cynicism.
We’ve also updated rule #6 in the “do not” section:
6. [Do not] Post misinformation or disinformation: Misinformation and disinformation present a threat not only to the site’s reputation but to political discourse in general. If a claim seems especially exciting or surprising, check for accuracy, look for corroborating and credible sources, and assess the plausibility of the argument before posting it, even if it fits a left-wing narrative. Be a wise and critical consumer of media, and correct your mistakes when necessary. Posting misinformation or disinformation can lead to administrative intervention, including an account ban.
I’ve already addressed the importance of critically evaluating information. But the latter half of that section is important too: correct your mistakes when necessary, and do so immediately. Comments will be a first line of defense, as our community is quick to point out factual errors. There’s no shame in noting that you screwed up. In fact, it’s honorable to hold the truth above all else. Correct your mistake, and put a note at the top explaining what happened. Help ensure other people don’t fall for the same dis- and misinformation.
We understand that people aren’t glued to their stories and comments, and may not be around when false content is flagged. If that’s the case, when we come across a story with bad information, we will proactively edit the story and add an admin’s note at the top explaining what was changed, and why. We debated adding a grace period before we intervene, but the dangers of letting any false content sit on Daily Kos, even for short periods of time, are too great.
This applies to headlines too. Deceptive or false headlines, even if they’re explained in the text of a story, can spread fast across the internet and mislead people about what is real.
The second major change we’ve instituted to better guard against false content is staff-facing, because we’re all in this together, after all. Our newsroom has instituted a formal code of ethics to which all of our writers must adhere. We have a new editorial workflow that puts numerous sets of eyes on any story before it is published. We also more clearly defined our editorial mission statement, which has been added to our About Us page. I wrote more about it here. At Daily Kos, we believe in continuous learning, and our editorial staff trains on the latest in journalistic ethics to stay current and uphold the highest standards of journalism.
We didn’t make these changes lightly. Changing our editorial workflow was a serious undertaking, upending the decades we had spent doing it the old way. We’ve similarly been historically loath to intrude into community stories except in the most extreme situations. But it’s now too easy for well-meaning people—even me!—to accidentally post bad information, and I don’t think anyone wants Daily Kos to facilitate that.
We’ll be reading your comments closely to see if there’s anything we might’ve missed that requires adjusting.