TikTok CEO Vanessa Pappas has reached out to leaders at other social media platforms to propose a collaboration to protect users from harm, specifically from extremely violent and graphic content, including material depicting suicide, reports indicate. The proposed solution is a “hashbank,” which Ms. Pappas outlined in an open letter to her fellow executives published on TikTok’s official blog.
So what exactly is TikTok’s proposed solution to harmful content on social media?
In effect, the TikTok executive wants to curb the spread of harmful content on social media by collaborating with other platforms to remove it. That would happen through the “cooperative development of a Memorandum of Understanding.”
Under that Memorandum, TikTok hopes participating platforms would quickly identify and flag content that each individual platform deems harmful, with the identifying details shared among all participants. So, for example, TikTok, Facebook, YouTube, and other platforms might take part. If YouTube or TikTok flags a piece of content, the other platforms would be notified about it.
Potentially, companies could send that notification early enough that it arrives before the content spreads to the other platforms.
In that hypothetical scenario, the coalition of participants would theoretically be able to keep the content off their apps and websites, or at least watch for it and any copycat content so it can be taken down as soon as it arrives.
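To illustrate the general idea, here is a minimal, hypothetical sketch of how such a shared “hashbank” might work: one platform flags a piece of content, a fingerprint (hash) of it is added to a shared registry, and other platforms check incoming uploads against that registry. The class and method names below are illustrative assumptions rather than anything TikTok has published, and a real system would likely use perceptual hashing so that re-encoded or slightly edited copies still match.

```python
import hashlib


class HashBank:
    """Hypothetical shared registry of fingerprints for flagged content."""

    def __init__(self):
        self._flagged = {}  # fingerprint -> name of the platform that flagged it

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Exact SHA-256 for simplicity; real systems would use perceptual
        # hashing so that slightly modified copies are still caught.
        return hashlib.sha256(content).hexdigest()

    def flag(self, content: bytes, platform: str) -> str:
        """A participating platform reports harmful content to the bank."""
        digest = self.fingerprint(content)
        self._flagged[digest] = platform
        return digest

    def is_flagged(self, content: bytes) -> bool:
        """Other platforms check uploads against the shared registry."""
        return self.fingerprint(content) in self._flagged


# Illustrative usage: one platform flags a clip, another blocks a re-upload.
bank = HashBank()
harmful_clip = b"...video bytes..."

bank.flag(harmful_clip, platform="TikTok")

if bank.is_flagged(harmful_clip):
    print("Upload blocked: content matches a shared hashbank entry")
```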
This is a very TikTok solution, but it might never show results
Now, since TikTok effectively works as a platform for generating and propagating memes in short-video form, this type of solution makes quite a bit of sense coming from the company.
Put simply, TikTok users frequently attempt to replicate what they see others doing on the platform, and that, in turn, tends to spread the content to other platforms. Given TikTok’s meteoric rise over the past year, it’s not surprising that it’s the source of a lot of great content. But it likely sees quite a lot of negative or harmful content first as well.
So, at least in theory, the proposed solution could go a long way toward improving content across those platforms. The company has already worked with others to address content in a similar fashion before, specifically content related to child sexual abuse.
But it appears just as likely that TikTok may never see the new proposal through to fruition. The company is under immense scrutiny in the US in particular, one of its largest markets. So even if the coalition is formed, TikTok may not have as much influence there as it hopes, especially if no resolution is reached on US government demands that TikTok parent company ByteDance sell the app to a US buyer.