YouTube Finds Some Success Reducing Conspiracy Recommendations

YouTube has reportedly found a modicum of success in its effort to curb the conspiracy theories perpetuated via recommendations on its streaming platform. That’s based on a recent study conducted by researchers at UC Berkeley, who reviewed 8 million recommendations covering a 15-month period.

In particular, YouTube has been taking a harder stance on videos in two categories: conspiracy theories claiming that 9/11 was a US government-sponsored inside job, and flat earth conspiracies.

Between June and December of last year, the study found that the percentage of conspiracy theory recommendations dropped first by 50 percent and later by 70 percent, the largest drop recorded in the study.

Although the overall results fluctuated over time, the researchers say recommendations of that content are down. Specifically, such recommendations are now 40 percent less common than when the study started, a figure that accounts for additional factors such as the popularity of the source video.

YouTube’s crackdown is ongoing

YouTube has cracked down on various content over the years, from music to pranks and challenges it deems dangerous. The above-mentioned crackdown on conspiracy theories is among the more ambitious of the company’s efforts. But that effort has specifically targeted the types of conspiracies outlined above, along with others that might garner bad publicity.

The platform still has plenty of other issues to deal with. For instance, while conspiracies surrounding flat earth, 9/11, or false flag attacks are diminishing, others remain untouched. Reports on the study indicate that other conspiracy theories, such as claims that aliens built the pyramids and climate change denial, are still going strong.

YouTube’s work here is obviously far from done if it hopes to address all conspiracy theories and other factually incorrect assertions.

There is still plenty of wiggle room on these results

Setting aside the conspiracy theories that are still alive and well in YouTube recommendations, the company may not be doing quite as well as the figures suggest. To begin with, the study’s methodology leaves plenty of room for error.

The researchers used an algorithm to gauge whether a recommended YouTube video contained a conspiracy theory, based on the video’s description, comments, and transcript. So the accuracy is largely dependent on how well the algorithm recognized the contextual meaning of the terms and words used.
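To give a rough sense of what that kind of text-based scoring can look like, here is a minimal sketch assuming a simple TF-IDF and logistic regression classifier over a video’s description, comments, and transcript. The actual model, features, training data, and threshold the Berkeley researchers used are not described here, so every name and example below is an illustrative assumption rather than their method.

```python
# Minimal sketch of text-based conspiracy scoring (illustrative assumptions only;
# not the Berkeley researchers' actual classifier or data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def combine_fields(video):
    """Concatenate the text fields the study reportedly relied on."""
    return " ".join([video.get("description", ""),
                     video.get("comments", ""),
                     video.get("transcript", "")])

# Hypothetical labeled examples: 1 = conspiratorial, 0 = not.
train_videos = [
    {"description": "proof the earth is flat", "comments": "wake up",
     "transcript": "there is no curvature"},
    {"description": "how planes generate lift", "comments": "great explainer",
     "transcript": "lift and drag explained"},
]
train_labels = [1, 0]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(combine_fields(v) for v in train_videos)
model = LogisticRegression().fit(X, train_labels)

def conspiracy_score(video):
    """Return the model's estimated probability that a recommended video is conspiratorial."""
    return model.predict_proba(vectorizer.transform([combine_fields(video)]))[0, 1]
```

However such a classifier is built, its judgments hinge on the wording of the text it sees, which is exactly the limitation the study’s critics point to.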

The study also didn’t use the YouTube platform in a way that allowed for personalized recommendations; namely, the researchers didn’t sign in. When users are signed into YouTube, recommendations are often based on what they’ve already watched. So, for users who already watch conspiracy theory content on a regular basis, those videos may still surface without any reduction at all.

It’s not immediately clear how different the study’s results would be once those extra factors are taken into account.

It may be that the Google-owned company’s handling of conspiracy theories is working as well for personalized recommendations as it is for users who aren’t logged in. It may also be that personalized recommendations aren’t addressed by YouTube’s efforts at all. Conversely, it may be that the way conspiracy theories were categorized leaves too much room for interpretation.