Facebook to Establish Independent Governance Team to Decide On Content Appeals
Facebook disclosed plans to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding.
In a lengthy blog post, Facebook CEO Mark Zuckerberg said that the purpose of the new body would be to "uphold the principle of giving people a voice while also recognizing the reality of keeping people safe."
Zuckerberg said that independence will prevent the concentration of too much decision-making within Facebook's teams, and will also create accountability and oversight. "It will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons," he added.
Facebook is still in the early stages of defining how this will work in practice. Starting today, the company is beginning a consultation period to decide how the members of the body will be selected and how the company will ensure their independence from Facebook. As part of this consultation period, Facebook will begin piloting these ideas in different regions of the world in the first half of 2019, with the aim of establishing this independent body by the end of the year.
"Over time, I believe this body will play an important role in our overall governance. Just as our board of directors is accountable to our shareholders, this body would be focused only on our community. Both are important, and I believe will help us serve everyone better over the long term," Zuckerberg added.
The Facebook CEO also talked about his company's responsibility to keep people safe on its services -- whether from terrorism, bullying, misinformation or other threats. Facebook employs teams of around 30,000 people to enforce its policies, combined with artificial intelligence that handles the most repetitive work; the human reviewers focus on the more nuanced cases, since the AI is still not capable of handling these challenges on its own. Over the course of the company's three-year roadmap through the end of 2019, Zuckerberg expects Facebook's systems to be trained to proactively detect the vast majority of problematic content.
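As a rough illustration of how such a hybrid enforcement split is often built (this is not Facebook's actual pipeline, and all names and thresholds below are hypothetical), a classifier can automatically act on high-confidence cases while routing ambiguous ones to human reviewers:

```python
# Hypothetical sketch of a hybrid AI/human moderation triage.
# None of these names or thresholds come from Facebook; they only
# illustrate the "AI handles repetitive cases, humans handle nuance"
# split described above.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    text: str


def classify_violation(post: Post) -> float:
    """Stand-in for a trained model returning P(post violates policy)."""
    # A real system would run an ML classifier here.
    return 0.0  # placeholder


AUTO_REMOVE_THRESHOLD = 0.95  # near-certain violations: act automatically
AUTO_ALLOW_THRESHOLD = 0.05   # near-certain benign posts: no action


def triage(post: Post) -> str:
    score = classify_violation(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # repetitive, clear-cut work for the AI
    if score <= AUTO_ALLOW_THRESHOLD:
        return "auto_allow"
    return "human_review"      # nuanced cases go to human reviewers
```

Raising the proactive-detection rate Zuckerberg mentions amounts to widening the band of cases the model can handle automatically, shrinking the share that falls to human review.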
Facebook said on Thursday it had identified about 2.1 million posts as bullying or harassment on the social network between April and September.
The company revealed the numbers in its second community standards enforcement report, which introduces a new category of data detailing bullying and harassment posts.
Zuckerberg also talked about the need to tackle clickbait and misinformation on Facebook by removing the fake accounts that generate it and by reducing its distribution and virality.
He also called for the right regulations, which he said would be an important part of a full system of content governance and enforcement.
Zuckerberg added that the company is taking more drastic measures to reduce the spread of sensational content on its site -- by tweaking the news feed algorithm.
"People naturally engage with more sensational content," Zuckerberg said. "What we see is that as content gets closer to the line of what is prohibited by our community standards, people seem to engage with it more." Facebook will now reduce distribution for that "borderline" content on the social network, Zuckerberg said.
"There is no single solution to these challenges, and these are not problems you ever fully fix. But we can improve our systems over time, as we've shown over the last two years. We will continue making progress as we increase the effectiveness of our proactive enforcement and develop a more open, independent, and rigorous policy-making process. And we will continue working to ensure that our services are a positive force for bringing people closer together," Zuckerberg added.