
Does YouTube’s Algorithm Promote Extremism?

It may have happened to you while browsing the internet. You see a link on a website like Reddit or Twitter, and it leads you to a video pushing extreme ideas from one end of the political spectrum or the other. At first, you might watch out of spite or morbid curiosity, but before you know it, your recommendations are filled with very similar videos, to the point where you could slowly be indoctrinated into a way of thinking. That's because YouTube's algorithm is built so that watching just one video on a particular subject gets you bombarded with many more on the same topic.

Thanks to the YouTube algorithm, you tend to fall into what's known as a channel cluster, which funnels your suggestions toward ever more specific topics. Watch one video criticizing "SJWs" (social justice warriors), and the suggestions start swaying to the right of the aisle. The recommended videos drift further and further right until you're being served genuinely alarming content that is openly racist and sexist. The same can be said for anti-capitalism videos, which can lead toward far-left ideas such as forcefully tearing down the government. These rabbit holes were once the stuff of one cat video leading to an hour-long compilation playlist, but in recent years they have become very politically extreme.

This seems to be especially true on the far right of politics. "Consumption of far right and 'anti-woke' content on YouTube – while small relative to politically moderate and nonpolitical content – is stickier and more engaging than other content categories," says one study conducted by Stanford University, which adds that "The growing engagement with radical content on YouTube may simply reflect a more general trend."

With that said, far-right extremist content tends to generate more of a response and thus becomes more deeply ingrained in the algorithm. If you see a video about politics with no obvious bias in either direction, you're likely to shrug, say "meh," and move on to the next video. But when you vehemently oppose something, you watch the entire video out of seething curiosity and then react: disliking it, getting into comment wars, and so on. All of that still counts as engagement.
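To make that feedback loop concrete, here is a rough Python sketch of a generic engagement-weighted recommender. It is purely hypothetical: the topic names, weights, and functions are invented for illustration and are not YouTube's actual system. What it shows is simply that a dislike or a comment war can boost a topic's ranking just as effectively as a like.

# Hypothetical engagement-weighted ranker -- an illustration, not YouTube's real system.
# Any strong reaction (long watch time, dislikes, heated comments) raises a topic's
# score, so the "next up" list drifts toward whatever provokes the most engagement.

from collections import defaultdict

# Illustrative weights: note that negative reactions still add to the score.
WEIGHTS = {"watch_seconds": 0.01, "like": 1.0, "dislike": 0.8, "comment": 1.5}

def update_scores(scores, events):
    """Add each engagement event's weight to the score of that video's topic."""
    for topic, kind, amount in events:
        scores[topic] += WEIGHTS[kind] * amount
    return scores

def recommend(scores, top_n=3):
    """Suggest the topics the viewer engaged with most, hostile or not."""
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

scores = defaultdict(float)
# A viewer hate-watches one inflammatory video to the end and argues in the comments...
update_scores(scores, [("outrage_politics", "watch_seconds", 600),
                       ("outrage_politics", "dislike", 1),
                       ("outrage_politics", "comment", 4)])
# ...and skims a neutral explainer for a minute.
update_scores(scores, [("neutral_news", "watch_seconds", 60)])

print(recommend(scores))  # ['outrage_politics', 'neutral_news'] -- the outrage wins

In a toy model like this, the viewer who argued with a video they hated ends up with that topic at the top of their suggestions, which is exactly the dynamic described above.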

There's a hypothesis called algorithmic radicalization, and researchers have been studying this very phenomenon. Research began in earnest in 2021 with several major studies, and results are still being worked out to this day. One study has already noted that YouTube's "Changes appear to have affected the propagation of some of the worst content on the platform, reducing both recommendations to conspiratorial content on the platform and sharing YouTube conspiracy videos on Twitter and Reddit."

Under this hypothesis, YouTube acts as the starting point for extremism: it hosts the content, and the algorithm catches people who have fallen down the "rabbit hole." From there, the content spreads like wildfire across social media platforms, and some people are converted to these extremist views. Interestingly enough, though, the algorithm tends to capitalize on how people already felt or what their interests already were. With that in mind, perhaps it isn't a YouTube problem to begin with; the seeds may already be planted, and YouTube is merely the symptom.
