YouTube has launched an investigation into disturbing search suggestions recently produced by the site's autocomplete feature, which generated phrases containing sexually explicit terms involving minors. The video-sharing service overhauled its autocomplete tool and removed the offending suggestions as soon as the issue was brought to the company's attention, according to the BBC.
The issue involves search queries beginning with the words "how to have," which YouTube's autocomplete feature finished with suggestions that varied in wording but generally carried a sexual nature involving children or family members. It remains unclear how those word combinations arose or who was responsible for the apparently gamed search recommendations, although some industry watchers believe trolls deliberately abused the platform's algorithm. It is also unclear how the phrases spread so widely, though one plausible mechanism is that a large number of users deliberately typed those phrases into the platform's search field, training the autocomplete system to suggest them. Notably, the search results themselves did not surface videos depicting child abuse.
At the moment, nothing has been confirmed about how or why the disturbing suggestions appeared on YouTube, though some industry watchers speculate that a coordinated effort was mounted to embarrass the site. The alarming autocomplete results surfaced just days after YouTube began cracking down on a series of inappropriate videos that have targeted children. The actions YouTube has taken in that regard include demonetizing those videos, banning their creators, investigating possible cases of child abuse in collaboration with law enforcement, and moderating the inappropriate comments associated with such videos. Last June, the Google-owned site also removed ads from what it described as demeaning, hateful, and inappropriate videos featuring violence and sexual content, in an effort to stop those creators from making money on the platform. YouTube's crackdown on abusive content has been going on for years, and it is likely to continue as inappropriate videos keep popping up on the platform.