Google-owned video streaming service YouTube removed a total of 8.3 million videos from its service in the three-month period ending December 31, and has published a transparency report detailing how that happened. The report breaks down how many videos were flagged by humans versus YouTube's automated systems, which flag categories those videos fell into, and even offers a bit of insight into how the whole flagging process works. There's also a video, embedded below, that walks through the process from a video being flagged to its removal.
Flags generally fall into a number of categories, such as spam, hateful clips, and sexual content. For the reported period, the biggest category was sexual content, accounting for about 30 percent of total removals. Spam and misleading content was a close second at around 25 percent of flags over the same period. The first step in the flag-and-review process is for a video to actually be flagged. Most of the time, YouTube's machine learning algorithms catch potentially objectionable content, and in some cases can even remove it automatically; this was the case for just over 75 percent of the videos flagged and removed during this period. Some videos may slip past the machines for one reason or another, or may need to be looked over by a human after being flagged.
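To make that flow concrete, here is a minimal Python sketch of such a pipeline. To be clear, this is an illustration under assumptions, not YouTube's actual system: the category names, confidence threshold, and routing logic are all hypothetical stand-ins for whatever YouTube uses internally.

```python
from dataclasses import dataclass
from enum import Enum

class FlagCategory(Enum):
    SEXUAL_CONTENT = "sexual content"
    SPAM_OR_MISLEADING = "spam or misleading"
    HATEFUL = "hateful"
    OTHER = "other"

@dataclass
class Flag:
    video_id: str
    category: FlagCategory
    score: float   # classifier confidence, 0.0 to 1.0
    source: str    # "machine" or "human"

# Hypothetical cutoff above which a machine flag is acted on automatically.
AUTO_REMOVE_THRESHOLD = 0.98

def route_flag(flag: Flag, review_queue: list) -> str:
    """Route a flagged video: high-confidence machine flags are removed
    automatically; everything else waits for a human reviewer."""
    if flag.source == "machine" and flag.score >= AUTO_REMOVE_THRESHOLD:
        return "removed automatically"
    review_queue.append(flag)
    return "queued for human review"

queue: list = []
print(route_flag(Flag("abc123", FlagCategory.SPAM_OR_MISLEADING, 0.99, "machine"), queue))
# -> removed automatically
print(route_flag(Flag("def456", FlagCategory.HATEFUL, 0.60, "machine"), queue))
# -> queued for human review
```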
There are some deviations and side effects that can be part of the flagging and removal process. Extremely dangerous or egregious videos are submitted to relevant government agencies in many cases; threats of violence, for example, go to law enforcement. Videos promoting extremism, footage of violence apparently committed or filmed by the uploader, and other videos that may be evidence of a crime also end up in the hands of law enforcement when removed.

Getting a video flagged and removed normally results in a single strike for the uploader, and three strikes will see the channel taken down. If the review team looks over a channel that posted a removed video and finds that the channel itself is in violation, it can be removed immediately. In either case, the uploader can appeal YouTube's decision. The platform generally does not remove videos that are artistic, scientific, educational, or documentary in purpose, but videos that exist solely for shock value, infringe copyright, or otherwise violate community guidelines are not safe. Flagged videos that do stay up are often age-gated, demonetized, or both, depending on their content and intent. The entire process generally does not take long from start to finish, though in cases where YouTube's review team and an uploader do not see eye to eye, the dispute can drag out considerably and may or may not end in the uploader's favor.
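The strike logic described above amounts to a simple counter with an appeal path. The toy Python model below sketches it under that reading; the class, method names, and the assumption that a granted appeal reverses a strike are illustrative, not drawn from YouTube's code.

```python
from collections import defaultdict

STRIKE_LIMIT = 3  # three strikes and the channel comes down

class StrikeTracker:
    """Toy model of the strike system described above (not YouTube's code)."""

    def __init__(self) -> None:
        self.strikes = defaultdict(int)

    def record_removal(self, channel_id: str) -> str:
        """Each removed video earns the channel one strike."""
        self.strikes[channel_id] += 1
        if self.strikes[channel_id] >= STRIKE_LIMIT:
            return f"channel {channel_id} taken down"
        return f"channel {channel_id} now has {self.strikes[channel_id]} strike(s)"

    def appeal_granted(self, channel_id: str) -> None:
        """Assumed behavior: a successful appeal reverses the strike."""
        if self.strikes[channel_id] > 0:
            self.strikes[channel_id] -= 1

tracker = StrikeTracker()
print(tracker.record_removal("some_channel"))  # 1 strike
print(tracker.record_removal("some_channel"))  # 2 strikes
print(tracker.record_removal("some_channel"))  # channel taken down
```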