Two weeks ago I wrote about The Great Hack and the manipulation of our electorate, and other countries’, via the collaboration between Cambridge Analytica and Facebook. Last week was about something more inspiring and hopeful, but this week I’m back to the dark side with a diary about the far-right tilt in YouTube videos.
As our way-too-long presidential election continues, we should have our eyes wide open about what’s happening. Although we knew something different was happening four years ago, I don’t think we really understood the depth of it, and our response was to turn to private groups and do what we could to swat at deliberate misinformation.
I want us to be ready for the assaults, with information we can use and share with others. I think we’ve done a better job of pushing back and calling the attacks what they are. We’ve blocked, reported and unfollowed/unfriended those pushing it. Let’s keep at it and do what we can to fight the attacks.
******
Yesterday Joker included an excerpt from Trumplandia about a viral video supporting the traitor. It highlights the power and spread of viral videos.
It’s not surprising that clicking on a link to that video would lead you to other anti-Democratic Party or pro-traitor videos.
However, as this article from the NYT shows, it’s not just people who start their search with a pro-traitor video who are led to other right-wing videos.
And as this one from The Guardian shows, even starting from a neutral search on YouTube will likely tilt you toward far-right videos via the “up next” autoplay queue.
Let’s look first at the NYT story. Its focus is on Brazilian politics, specifically the election of Jair Bolsonaro, told through the indoctrination of a young Brazilian musician, Matheus Dominguez.
YouTube had recently installed a powerful new artificial intelligence system that learned from user behavior and paired videos with recommendations for others. One day, it directed him to an amateur guitar teacher named Nando Moura, who had gained a wide following by posting videos about heavy metal, video games and, most of all, politics.
In colorful and paranoid far-right rants, Mr. Moura accused feminists, teachers and mainstream politicians of waging vast conspiracies. Mr. Dominguez was hooked.
The NYT’s conclusion is that YouTube “systematically” pushes viewers to far-right and conspiracy content. The article includes the results of a research team from Taiwan that confirm the prominent role the YouTube algorithm played in the Brazilian presidential election. Bolsonaro and the other Brazilian politicians who maximized the potential of YouTube in their campaigns are still using it, much like our president uses Twitter: to troll, shock, and distract.
As we saw with Cambridge Analytica, success depends on the same emotions. And like the Facebook ads, it becomes a self-reinforcing loop: watch one of these videos, or “Like” a FB meme, and you get more of the same.
But the emotions that draw people in — like fear, doubt and anger — are often central features of conspiracy theories, and in particular, experts say, of right-wing extremism.
As the system suggests more provocative videos to keep users watching, it can direct them toward extreme content they might otherwise never find. And it is designed to lead users to new topics to pique new interest — a boon for channels like Mr. Moura’s that use pop culture as a gateway to far-right ideas.
emphasis mine
Especially damaging is the effect on young people in Brazil, and likely everywhere the content is unrestricted. One student quoted in the article says that watching the pushed videos was his “political education,” and that it was how young people got most of their information.
Again, we see parallels in this country, where actual fake news on YouTube competes with reliable sources of information and pushes eerily similar themes: vaccines, feminism and homosexuality.
The Guardian article is from February 2018 but still informative. Its reporting is based on interviews with a former YouTube engineer, Guillaume Chaslot.
Company insiders tell me the algorithm is the single most important engine of YouTube’s growth. In one of the few public explanations of how the formula works – an academic paper that sketches the algorithm’s deep neural networks, crunching a vast pool of data about videos and the people who watch them – YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”.
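The paper being quoted there is one YouTube’s own engineers published in 2016, “Deep Neural Networks for YouTube Recommendations.” It describes a two-stage design: one network narrows millions of videos down to a few hundred candidates, and a second ranks those candidates by predicted engagement, in effect expected watch time. Here is a minimal sketch of that two-stage idea; the random embeddings and the dot-product “ranker” are toy stand-ins for the real trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for learned embeddings. In the real system these come
# out of deep networks trained on billions of watch events, not random draws.
N_VIDEOS, DIM = 10_000, 32
video_embeddings = rng.normal(size=(N_VIDEOS, DIM))

def candidate_generation(user_embedding, k=200):
    """Stage 1: narrow the full corpus to a few hundred candidates by
    nearest-neighbour search in the shared embedding space."""
    scores = video_embeddings @ user_embedding
    return np.argsort(scores)[-k:][::-1]          # top-k video IDs

def predicted_watch_time(user_embedding, video_id):
    """Stage 2 stand-in: the real ranking network predicts expected
    watch time from hundreds of features; here it's just a dot product."""
    return float(video_embeddings[video_id] @ user_embedding)

def up_next(user_embedding, n=20):
    """Rank the candidates and return the top n: roughly what fills
    the 'up next' column."""
    candidates = candidate_generation(user_embedding)
    return sorted(candidates,
                  key=lambda v: predicted_watch_time(user_embedding, v),
                  reverse=True)[:n]
```

The thing to notice is the objective: a system rewarded for keeping you watching will surface whatever it predicts is most engaging, with no term anywhere for whether it’s true.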
In the summer of 2016, Chaslot wrote a program that he would use to research the bias in YouTube’s algorithms. Here’s the basic design:
It finds videos through a word search, selecting a “seed” video to begin with, and recording several layers of videos that YouTube recommends in the “up next” column. It does so with no viewing history, ensuring the videos being detected are YouTube’s generic recommendations, rather than videos personalised to a user. And it repeats the process thousands of times, accumulating layers of data about YouTube recommendations to build up a picture of the algorithm’s preferences.
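That description maps onto a fairly simple crawler. Here is a minimal sketch in Python; get_up_next() is a hypothetical placeholder for the scraping step, which the article doesn’t detail, and the depth and branch numbers are guesses:

```python
from collections import Counter, deque

def crawl_recommendations(seed_ids, get_up_next, depth=4, branch=5):
    """Breadth-first walk of YouTube's recommendation graph.

    seed_ids    -- videos found via a word search (the "seed" videos)
    get_up_next -- function mapping a video ID to the list of IDs shown
                   in its 'up next' column, fetched with no cookies or
                   viewing history so the results are YouTube's generic
                   recommendations rather than personalised ones
    """
    counts = Counter()                      # how often each video is recommended
    queue = deque((vid, 0) for vid in seed_ids)
    seen = set(seed_ids)
    while queue:
        video_id, layer = queue.popleft()
        if layer == depth:
            continue                        # stop after `depth` layers
        for rec in get_up_next(video_id)[:branch]:
            counts[rec] += 1                # videos reachable by many paths rank high
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, layer + 1))
    return counts
```

Run against thousands of seed videos, the tallies in counts become the “layers of data about YouTube recommendations” the article describes: videos that keep surfacing across many seeds and layers are the ones the algorithm prefers.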
He used the program to research elections in France, Germany, and the UK. He found what we now know to expect: that the algorithm pushed viewers to conspiracy and far-right content. This was the case even with something as simple as “who is Michelle Obama?”
What was YouTube’s role in our 2016 presidential election? Chaslot uncovered that in his research, and it’s disturbing, especially combined with the Facebook/Cambridge Analytica psychological experiment. His conclusion was that whether you started your search with “Clinton” or “Trump,” you were ultimately pushed in a pro-Trump direction.
The author of the Guardian article reviewed a portion of the videos in Chaslot’s database and quantified what they saw:
The sample we had looked at suggested Chaslot’s conclusion was correct: YouTube was six times more likely to recommend videos that aided Trump than his adversary. YouTube presumably never programmed its algorithm to benefit one candidate over another. But based on this evidence, at least, that is exactly what happened.
emphasis mine
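The article doesn’t spell out the exact methodology behind that figure, but the arithmetic is simple once each recommended video in the sample has been classified, which the Guardian’s reporters did by watching the videos and judging whom each one benefited. A hypothetical helper along these lines:

```python
def bias_ratio(counts, classify):
    """Given the recommendation tallies from crawl_recommendations() and
    a classify(video_id) -> 'trump' | 'clinton' | 'other' judgment
    (human reviewers, in the Guardian's case), return how many times
    more often recommendations favoured one candidate than the other."""
    trump = sum(n for vid, n in counts.items() if classify(vid) == "trump")
    clinton = sum(n for vid, n in counts.items() if classify(vid) == "clinton")
    return trump / clinton if clinton else float("inf")
```

A return value of 6.0 is the “six times more likely” in the quote above.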
And the impact in an extremely close Electoral College election like we had in 2016?
Even a small bias in the videos would have been significant. “Algorithms that shape the content we see can have a lot of impact, particularly on people who have not made up their mind,” says Luciano Floridi, a professor at the University of Oxford’s Digital Ethics Lab, who studies the ethics of artificial intelligence. “Gentle, implicit, quiet nudging can over time edge us toward choices we might not have otherwise made.”
I think Christopher Wylie was correct in The Great Hack in how he described the Cambridge Analytica/Facebook work: “You’re playing with the psychology of an entire country without their consent or awareness.”
There’s a lot more to say about this subject, but I suggest taking the time to read the two articles.
I’m going to close with these two quotes from Zeynep Tufekci, a social media scholar, one from each article.
First from the New York Times, where she discusses YouTube and calls it “one of the most powerful radicalizing instruments of the 21st century.”
“YouTube is the most overlooked story of 2016,” Tufekci tweeted in October 2017. “Its search and recommender algorithms are misinformation engines.”