Legislation Proposed To Address Pedophilic Behavior On YouTube

A new bill proposed by US Senator Josh Hawley would fundamentally change the way at least some videos on YouTube and other services are surfaced. Specifically, the proposal aims to make it more difficult for pedophiles to find content containing children. Dubbed the Protecting Children from Online Predators Act, the legislation would effectively stop video content curators and sites from recommending any videos primarily featuring minors.

The restriction would not apply to professionally produced content, with prime-time talent-show competitions offered as an example. Nor would it prohibit recommendations where the host or primary subject of the video is not a minor. Videos featuring children would still be discoverable via search and uploadable by parents, channels, and other content creators.

However, the promotion or recommendation of videos that feature children as the primary subject would, if the bill passes into law, result in not only hefty fines but also criminal penalties for the hosting site in question.

What’s the story behind the legislative push?

According to a short description of the bill shared by the Republican senator from Missouri, the legislation was inspired by an investigative report released by the New York Times. The findings of that report suggested that pedophiles were not only able to exploit, but were actively abusing, recommendation algorithms created by YouTube and other host sites to discover video content of scantily clad children.

In one instance, a video a mother had uploaded of her children playing in a backyard pool shot from virtually no views to over 400,000 on YouTube in a matter of days. The underlying problem, the report concluded, was that YouTube's algorithms were recommending the video, and others like it, based on the imagery shown, to viewers who had been watching sexually suggestive media.

In short, anybody who had previously searched for and watched videos of children in innocent but potentially compromising situations could then take advantage of the nearly endless recommendations YouTube surfaces based on watch history to find more such content with very little effort.
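To make that feedback loop concrete, here is a minimal, hypothetical sketch of how a watch-history-based recommender behaves. This is not YouTube's actual system; the video catalog, tags, and similarity scoring below are invented purely for illustration.

```python
# Hypothetical illustration of a watch-history-based recommender.
# Videos are represented by simple tag sets; this is NOT YouTube's
# actual algorithm, just a sketch of the general technique.

CATALOG = {
    "vid_a": {"family", "pool", "kids"},
    "vid_b": {"family", "pool", "kids"},
    "vid_c": {"cooking", "recipe"},
    "vid_d": {"kids", "backyard"},
    "vid_e": {"gaming", "review"},
}

def jaccard(tags_a, tags_b):
    """Similarity between two tag sets, from 0.0 to 1.0."""
    if not tags_a or not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)

def recommend(watch_history, top_n=3):
    """Rank unwatched videos by similarity to the watch history.

    Because every watched video pulls in more of the same kind,
    a narrow history quickly produces a self-reinforcing feed.
    """
    profile = set().union(*(CATALOG[v] for v in watch_history))
    candidates = [v for v in CATALOG if v not in watch_history]
    ranked = sorted(candidates,
                    key=lambda v: jaccard(profile, CATALOG[v]),
                    reverse=True)
    return ranked[:top_n]

# A viewer who watched a single "kids in pool" video is immediately
# steered toward every similar video in the catalog.
print(recommend(["vid_a"]))  # -> ['vid_b', 'vid_d', 'vid_c']
```

The point of the sketch is that similarity-based ranking has no notion of intent: it simply amplifies whatever pattern already exists in the watch history, which is exactly the loop the Times report described.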

A number of reports published prior to the New York Times piece indicated that pedophiles had also been using comment sections on YouTube and elsewhere as guides to easily find similar content.

Senator Hawley notes that YouTube has responded by beginning to adjust its algorithms, and that it already bars users under the age of 13 from having channels of their own, but says that isn't enough. YouTube and other hosting sites, the senator argues, should put the safety of children ahead of any monetary losses spurred by changes made to protect them.

This is only a partial solution to a very real problem

YouTube has, over the past several months and years, gained some notoriety as a service that doesn't do much to avoid controversy. Some of those controversies have been fairly innocuous, while others have resulted in attempts to adjust its algorithms against surfacing bigoted and harmful content, often containing violence or threats.

One recent example is the Google-owned company's decision to update its hate speech policies after videos surfaced of one widely viewed YouTuber repeatedly attacking the sexual orientation of another, a push that also included bans on neo-Nazi and Holocaust-denial videos.

The legislation in question, while well-intentioned and potentially helpful, seems to serve as more of a band-aid than a real solution. Potential predators will still be able to search out the content in question and won't necessarily even need to be logged in to find it, making it that much more difficult to track down who is watching videos with ill intent, and from where.

A fix from YouTube itself would likely prove no more effective, making the content merely inconvenient for predators to find rather than stopping the behavior altogether, unless the company outright banned the uploading of content featuring children. A similar situation likely exists across a number of social media services well beyond video platforms.

While the bill is still better than no legislation at all, the real solution at this juncture appears to remain in the hands of parents and content providers, who must ensure that uploaded video content is appropriate and kept safe.