YouTube's algorithms and radicalization of viewers




Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in The New York Times' March 12 edition, in a piece titled "YouTube, the Great Radicalizer," about her mini-experiments of creating YouTube accounts and watching videos on various topics, ranging from right-wing and left-wing politics to non-political subjects.

She concluded that YouTube’s recommendation algorithm feeds users content in a manner that appears to constantly up the stakes. YouTube radicalizes its users, even though “it is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model,” she writes.

People are lured to content that is more extreme than what they started with — or to incendiary content in general. 
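To make that mechanism concrete, here is a minimal, purely illustrative sketch (not anything resembling YouTube's actual system) of how a recommender that greedily maximizes predicted watch time can keep upping the stakes, assuming only that slightly more intense content tends to hold a viewer's attention slightly longer. Every function and number below is invented for illustration.

```python
# Toy model: a recommender that greedily maximizes predicted watch time,
# under the (assumed) condition that content a bit more intense than what
# the user is used to holds attention best. All values are made up.

import random

def predicted_watch_time(user_level, video_intensity):
    """Toy engagement model: peaks when a video is slightly more intense
    than the user's current baseline, and falls off far beyond it."""
    gap = video_intensity - user_level
    return max(0.0, 1.0 + gap - 0.5 * gap * gap)

def recommend(user_level, candidates):
    """Greedy choice: pick the candidate with the highest predicted watch time."""
    return max(candidates, key=lambda v: predicted_watch_time(user_level, v))

random.seed(0)
user_level = 1.0  # starting "intensity tolerance"
for step in range(10):
    candidates = [random.uniform(0.0, 10.0) for _ in range(20)]
    pick = recommend(user_level, candidates)
    # Watching shifts the user's baseline toward what was just recommended.
    user_level = 0.7 * user_level + 0.3 * pick
    print(f"step {step}: recommended intensity {pick:.2f}, user level {user_level:.2f}")
```

In this toy loop the recommended intensity ratchets upward step after step, not because anything "plots" escalation, but because escalation is what the engagement objective rewards.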

Tufekci also introduced readers to ex-Google engineer Guillaume Chaslot, who was dismissed in 2013, officially for poor performance, though he maintained that the real reason was that he had pushed too hard for changes in how the company handles such issues.

It was Chaslot who later helped The Wall Street Journal conduct an investigation of YouTube content. The investigation found that the platform often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material.

It is possible that YouTube’s recommender algorithm has a bias toward inflammatory content. Chaslot’s experiment discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended. 
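For a sense of the shape of such an experiment, below is a hedged sketch that follows a chain of "top recommendations" from a seed video and tallies where the chains end up. The get_top_recommendation function is a hypothetical, hand-written stand-in; Chaslot's actual work collected YouTube's real recommendation panels, which this toy does not do.

```python
# Sketch of a recommendation-chain tally: start from seed videos, repeatedly
# follow the top "up next" suggestion, and count where the chains land.
# The transition table below is synthetic, purely for illustration.

import random
from collections import Counter

random.seed(1)

def get_top_recommendation(video_label):
    """Hypothetical stand-in for 'what gets recommended next?'."""
    transitions = {
        "mainstream": ["mainstream", "partisan", "partisan"],
        "partisan":   ["partisan", "extreme", "extreme"],
        "extreme":    ["extreme", "extreme", "partisan"],
    }
    return random.choice(transitions[video_label])

def follow_chain(seed_label, hops=5):
    """Follow the top recommendation for a fixed number of hops."""
    label = seed_label
    for _ in range(hops):
        label = get_top_recommendation(label)
    return label

# Where do chains started from mainstream seeds end up?
endpoints = Counter(follow_chain("mainstream") for _ in range(1000))
print(endpoints)  # in this toy, most chains drift to "partisan" or "extreme"
```

The point of such a tally is not any single recommendation but the aggregate drift: if chains started from different seeds converge on the same pole of content, that is evidence of a systematic bias in the recommender.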

In short, what we are witnessing is the computational exploitation of a natural human desire to dig deeper and find the secrets that engage us, which eventually leads us down a rabbit hole of extremism while Google racks up the ad sales.

This situation is dangerous given how many people, especially young people, turn to YouTube for information. This state of affairs is unacceptable but not inevitable. Tufekci concludes: “There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.”

Said and done. 

by Damar Harsanto

