'Fiction is outperforming reality': how YouTube's algorithm distorts truth | Technology
<https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-di...>

[...] The software Chaslot wrote was designed to provide the world’s first window into YouTube’s opaque recommendation engine. The program simulates the behaviour of a user who starts on one video and then follows the chain of recommended videos – much as I did after watching the Logan Paul video – tracking data along the way.

It finds videos through a word search, selecting a “seed” video to begin with, and recording several layers of videos that YouTube recommends in the “up next” column. It does so with no viewing history, ensuring the videos being detected are YouTube’s generic recommendations, rather than videos personalised to a user. And it repeats the process thousands of times, accumulating layers of data about YouTube recommendations to build up a picture of the algorithm’s preferences.

Over the last 18 months, Chaslot has used the program to explore bias in YouTube content promoted during the French, British and German elections, global warming and mass shootings, and published his findings on his website, Algotransparency.com. Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational and conspiratorial. [...]
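The crawl the article describes – start from a seed video, follow the top "up next" recommendations layer by layer, with no viewing history, and tally which videos keep surfacing – can be sketched as a simple breadth-first walk. This is a minimal illustration, not Chaslot's actual code: `get_up_next` is a hypothetical stand-in for fetching YouTube's sidebar recommendations, and the toy graph below replaces live scraping.

```python
from collections import defaultdict

def crawl_recommendations(seed, get_up_next, depth, per_layer):
    # Breadth-first walk over "up next" recommendations: record each
    # layer of recommended videos and count how often every video is
    # surfaced across the whole crawl (a rough proxy for how strongly
    # the algorithm promotes it).
    layers = [[seed]]
    counts = defaultdict(int)
    for _ in range(depth):
        next_layer = []
        for video in layers[-1]:
            # Follow only the top recommendations, as a real crawler
            # with no viewing history would see generic suggestions.
            for rec in get_up_next(video)[:per_layer]:
                counts[rec] += 1
                next_layer.append(rec)
        layers.append(next_layer)
    return layers, counts

# Toy recommendation graph standing in for live "up next" data.
toy_graph = {"seed": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
layers, counts = crawl_recommendations(
    "seed", lambda v: toy_graph.get(v, []), depth=2, per_layer=2
)
```

Repeating such a crawl from many word-search seeds, as the article says Chaslot did thousands of times, turns the per-video counts into a picture of which content the recommendation engine favours.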
Alberto Cammozzo