Facebook To Hire Thousands To Check Violent Videos
Over the next year, Facebook will add 3,000 people to its community operations team around the world -- on top of the 4,500 the company has today -- to review the millions of reports it receives about inappropriate material, including violent video.
These reviewers will also help Facebook get better at removing content the social network does not allow, such as hate speech and child exploitation, Chief Executive Mark Zuckerberg said on Wednesday.
In addition to hiring more people, Facebook is building better tools to keep its community safe. The company will make it simpler for users to report problems, faster for reviewers to determine which posts violate Facebook's standards, and easier for them to contact law enforcement if someone needs help.
Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. The video remained up for more than a day and drew 370,000 views before Facebook removed it. Other videos, from places such as Chicago and Cleveland, have also shocked viewers with their violence.
The world's largest social network, with 1.9 billion monthly users, has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material.
However, Facebook still relies largely on its users to report problematic material.