YouTube plans to add more people next year to identify inappropriate content as the company responds to criticism over extremist, violent and disturbing videos and comments.
YouTube says human reviewers remain essential to both removing content and training machine-learning systems, because human judgment is critical to making contextualized decisions about content. Since June, YouTube's safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train YouTube's machine-learning technology to identify similar videos in the future. YouTube is also taking aggressive action on comments, launching new comment moderation tools and, in some cases, shutting down comments altogether.
Google plans to continue growing its teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate YouTube's policies to over 10,000 in 2018.
The moves come as advertisers, regulators and advocacy groups express ongoing concern over whether YouTube's policing of its service is sufficient.
YouTube is reviewing its advertising offerings as part of its response, and it teased that its next efforts could include further changes to the requirements for sharing in ad revenue.