
YouTube Ignored Toxic Video Warnings

YouTube executives were warned about toxic videos, but decided the company preferred the engagement those videos were generating over simply removing them.

Executives decided not to alter the recommendations shown alongside every video on the site, and not to remove the videos that were toxic, mainly those pushing conspiracy theories about events like last year's Parkland shooting. Leaders of the biggest video-sharing site were more interested in keeping viewers watching. Since that is how YouTube makes its money, they were putting revenue ahead of ethics.

According to a report from Bloomberg, large groups of YouTube and Google employees raised these concerns with executives over the last few years. Those employees also tracked the popularity of the toxic videos to show management how big the problem actually was. But senior management reportedly decided to stay the course so there wouldn't be a dip in engagement metrics.

This was also a sticking point for many employees who left YouTube in the past year, several of whom complained about the company's inability to tame these extreme and disturbing videos.

YouTube hasn't been entirely idle when it comes to removing extreme and conspiracy-theory videos, but it has been dragging its feet. The company uses machine learning and artificial intelligence to catch those videos before they actually appear on the site. Then again, machine learning seems to be Google's answer to everything these days, and lately that approach appears to be causing more problems than it solves.

YouTube has taken some action, which is why a number of videos have been demonetized over the past couple of years; advertisers don't want their ads running against extremist videos or content that recommends them. But white nationalist and neo-Nazi propaganda videos are still available on YouTube in droves, so there is still work to be done. YouTube has removed them from recommendations, added content warnings, and pulled advertising from them. Basically, it has done everything except remove them from the platform.

Unsurprisingly, YouTube is not alone in this regard. Twitter has also had trouble curbing content from white nationalists, neo-Nazis, and conspiracy theorists. It has taken some steps to make sure it is not promoting that content, but the users spreading it are still on the platform. This is something all platforms are going to need to work on and root out.

YouTube, Twitter, and other platforms are hesitant to remove content because they don't want to lose users, or the engagement those users generate. Leaving controversial content up is seen as a good thing: it gets more people engaging with the content and ultimately brings in more money for the platform. These companies all run on ads, and without that engagement their ad prices would drop significantly.