7/26/2023

Find the rabbit

Before Elon Musk entered the fray, however, a growing body of journalistic work and academic scholarship had begun to scrutinize the impact of social media platform algorithms on the type of content people see. On the one hand, most empirical research has found that user behavior, not recommendation algorithms, largely determines what we see online, and two recent studies disputed Musk's claim of anti-conservative bias. On the other hand, disclosures from the Facebook Files last fall suggested that adjustments to Facebook's algorithm amplified angry and polarizing content and may have helped foment the January 6 insurrection.

Social media content feeds are crucial to media consumption today. By extension, then, it is critical to understand how the algorithms that generate our feeds shape the information we see. In a new working paper, we analyze the ideological content promoted by YouTube's recommendation algorithm. Multiple media stories have posited that YouTube's recommendation algorithm leads people to extreme content. Meanwhile, other studies have shown that YouTube, on average, recommends mostly mainstream media content.

In our study, which utilizes a new methodological approach that makes it easier for us to isolate the impact of the YouTube recommendation algorithm than previous work, we found that YouTube's recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, although it does push users into increasingly narrow ideological ranges of content, in what we might call evidence of a (very) mild ideological echo chamber. We also find that, on average, the YouTube recommendation algorithm pulls users slightly to the right of the political spectrum, which we believe is a novel finding. In the remainder of this article, we lay out exactly why such research is important, how we did our research, and how we came to these conclusions.

YouTube's Recommendation Engine: Why Is It Important?

By many measures, mass polarization is on the rise in the United States. Americans are more willing to condone violence, less open to relationships that cut across party lines, and more prone to partisan motivated reasoning. We've seen two prime examples in the past two years. First, the nation's response to COVID-19: preventative measures such as mask wearing and vaccination became inextricably linked to partisanship. Even more dramatically, many Republicans claimed that the 2020 U.S. presidential elections (although not the concurrent legislative elections) were riddled with fraud, culminating in the January 6 Capitol attacks, while Democrats largely accepted the results as legitimate.

While few claim that social media is actually the root cause of political polarization, many worry that the affordances of social media are accelerating its more recent rise. One prominent concern is that our rapidly evolving information environment has increased the number of ideological news outlets and made it easier for individuals to exist in "echo chambers" where they're rarely confronted with alternative perspectives. Many believe that social media algorithms exacerbate this problem by suggesting content to users that they will enjoy. While this can be harmless or even beneficial in areas like sports or music, in areas such as news and health content this type of personalization could lead to harmful societal outcomes, such as siloing individuals into anti-vaccine, extremist, or anti-democratic echo chambers.

YouTube, started in 2005 and acquired by Google in 2006, has grown to prominence as the internet's archive for video content. Even before Facebook, Twitter, Reddit, or other platforms implemented algorithmically generated user feeds, YouTube was providing users with recommended videos to watch next. By many measures, YouTube is the largest social media platform in the United States. In 2021, 81% of American adults reported using YouTube, compared to 69% who use Facebook and 23% who use Twitter. YouTube is the second-most visited domain on the internet, just behind Google, its parent company. Twenty-two percent of Americans, or roughly 55 million people, also report regularly getting news on YouTube. In addition, YouTube's recommendation algorithm drives around 70% of total views on the platform. Taken together, these statistics suggest that YouTube's recommendation algorithm is vitally important for news consumption. Understanding what content YouTube recommends to users, and the extent to which it recommends various types of harmful content, is both an important and challenging problem to solve.

Why is studying YouTube's recommendation algorithm difficult?

For starters, the recommendation algorithm is highly personalized, meaning one individual's experience on YouTube can be completely different from another's.