YouTube to Recommend Fewer Videos About Conspiracy Theories
YouTube said on Friday that it would reduce recommendations of borderline content and content that could misinform users in harmful ways, such as videos promoting a phony miracle cure for a serious illness or claiming the earth is flat.
The move comes amid sustained criticism that YouTube has pushed extremist videos for years.
YouTube said that the shift would apply to less than one percent of the content on YouTube. It will only affect recommendations of which videos to watch, not whether a video is available on YouTube. People will still be able to access all videos that comply with YouTube's Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results.
"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users. This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations," Youtube said.
This will be a gradual change that initially affects recommendations for only a very small set of videos in the United States. Over time, as YouTube's systems become more accurate, the company will roll the change out to more countries.