
Facebook To Add 3,000 Team Members To Help Find Bad Videos

Facebook’s Mark Zuckerberg today announced that 3,000 people will be joining the company’s community operations team over the course of the next year. A more detailed time-frame was not provided, although by the close of the expansion the community operations team will have grown from its current 4,500 members to 7,500. The team is tasked with helping Facebook deal with videos and other content that is out of step with what the company wants to see on its site. Examples provided by Zuckerberg in today’s announcement include hate speech and child exploitation, although the list likely extends much further.

In fact, Facebook had recently been criticized by the UK’s Home Affairs Select Committee over how it deals with content deemed inappropriate, with the committee noting that Facebook is not doing enough to combat the use of the site to promote certain viewpoints and content, and is not acting quickly enough to remove such content when it does surface. Today’s announcement seems to be a direct answer to such criticism, as the expansion of the community operations team is designed to ensure that “the millions of reports” on unsuitable content Facebook receives each week are processed much more quickly. However, personnel are only one side of the equation, as alongside the staffing announcement Zuckerberg also confirmed that Facebook is developing “better tools” to help battle content that is not suitable for its community. These tools will aim to make the reporting of issues not only quicker, but also simpler.

It seems these improvements are not only designed to stop the spread of certain content, but also to ensure that Facebook is better equipped to respond quickly to users who find themselves in vulnerable situations. Zuckerberg noted that the current system has already been able to stop someone from committing suicide on Facebook Live, an area Facebook hopes it will get better at identifying and responding to through investments in technology and people such as those announced today.