Facebook Outlines Online Content Regulation Plan
In a white paper published Monday, Facebook detailed its push for internet regulation, calling on lawmakers to devise rules around harmful content, a different model for platforms’ legal liability and a “new type of regulator” to oversee enforcement.
Governments, academics and others are debating how to hold internet platforms accountable, particularly in their efforts to keep people safe and protect fundamental rights like freedom of expression.
Last year, Facebook CEO Mark Zuckerberg called for governments to work with online platforms to create and adopt new regulation for online content, noting, “It’s impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardized approach.”
The white paper, “Charting a Way Forward: Online Content Regulation,” sets out questions that regulation of online content might address and builds on recent developments on the topic, including legislative efforts and scholarship.
The paper poses four questions that go to the heart of the debate about regulating content online:
How can content regulation best achieve the goal of reducing harmful speech while preserving free expression? Facebook says that by requiring systems such as user-friendly channels for reporting content or external oversight of policies or enforcement decisions, and procedures such as periodic public reporting of enforcement data, regulation could give governments and individuals the information they need to accurately judge social media companies’ efforts.
How can regulations enhance the accountability of internet platforms? According to the paper, regulators could consider certain requirements for companies, such as publishing their content standards, consulting with stakeholders when making significant changes to standards, or creating a channel for users to appeal a company’s content removal or non-removal decision.
Should regulation require internet companies to meet certain performance targets? Companies could be incentivized to meet specific targets, such as keeping the prevalence of violating content below an agreed threshold, Facebook says; a rough sketch of what such a prevalence check could look like follows these questions.
Should regulation define which “harmful content” should be prohibited on the internet? Laws restricting speech are generally implemented by law enforcement officials and the courts. Facebook says that internet content moderation is fundamentally different. “Governments should create rules to address this complexity — that recognize user preferences and the variation among internet services, can be enforced at scale, and allow for flexibility across language, trends and context,” according to the social media company.
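To make the prevalence idea concrete, here is a minimal, hypothetical sketch of how such a performance target might be checked. The 0.05% threshold, the sampling approach, and all names are assumptions chosen for illustration; the white paper does not prescribe a specific metric or number.

```python
import random

# Hypothetical sketch: estimate the "prevalence" of violating content as
# the share of sampled content views found to violate policy, then compare
# it against an agreed regulatory target. The threshold, the simulated
# sample, and the function names are illustrative assumptions, not figures
# from Facebook's white paper.

AGREED_THRESHOLD = 0.0005  # assumed target: at most 0.05% of views violating

def estimate_prevalence(reviewed_views: list[bool]) -> float:
    """Return the fraction of reviewed views labeled violating (True)."""
    if not reviewed_views:
        return 0.0
    return sum(reviewed_views) / len(reviewed_views)

# Simulate an audit sample of 100,000 content views, ~0.03% violating.
sample = [random.random() < 0.0003 for _ in range(100_000)]

prevalence = estimate_prevalence(sample)
print(f"Estimated prevalence: {prevalence:.4%}")
print("Meets target" if prevalence <= AGREED_THRESHOLD else "Misses target")
```

In practice, a regulator relying on sampled estimates like this would also need to account for statistical uncertainty, since prevalence rates this low require large samples to measure reliably.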
The development of regulatory solutions should involve not just lawmakers, private companies and civil society, but also those who use online platforms. The following principles are based on what Facebook has learned from its work in combating harmful content and its discussions with others.
Incentives. Ensuring accountability in companies’ content moderation systems and procedures will be the best way to create the incentives for companies to responsibly balance values like safety, privacy, and freedom of expression.
The global nature of the internet. Any national regulatory approach to addressing harmful content should respect the global scale of the internet and the value of cross-border communications. It should aim to increase interoperability among regulators and regulations.
Freedom of expression. In addition to complying with Article 19 of the ICCPR (and related guidance), regulators should consider the impacts of their decisions on freedom of expression.
Technology. Regulators should develop an understanding of the capabilities and limitations of technology in content moderation and allow internet companies the flexibility to innovate. An approach that works for one particular platform or type of content may be less effective (or even counterproductive) when applied elsewhere.
Proportionality and necessity. Regulators should take into account the severity and prevalence of the harmful content in question, its status in law, and the efforts already underway to address the content.
"If designed well, new frameworks for regulating harmful content can contribute to the internet’s continued success by articulating clear ways for government, companies, and civil society to share responsibilities and work together. Designed poorly, these efforts risk unintended consequences that might make people less safe online, stifle expression and slow innovation," said Monika Bickert, Vice President, Content Policy
“If we don’t create standards that people feel are legitimate, they won’t trust institutions or technology,” Facebook’s Chief Executive Officer Mark Zuckerberg said in an op-ed in the Financial Times on Monday.
Facebook sees risks to innovation, freedom of expression
Facebook also warned on Monday of threats to innovation and freedom of expression, ahead of a raft of rules the European Union plans to release this week and in the coming months to rein in U.S. and Chinese tech companies.
The social media giant laid out its concerns ahead of Zuckerberg’s meeting in Brussels on Monday with EU antitrust chief Margrethe Vestager and EU industry chief Thierry Breton.
Vestager and Breton are due to announce proposals on Wednesday aimed at exploiting the bloc’s trove of industrial data and challenging the dominance of Facebook, Google and Amazon.
They will also propose rules to govern the use of artificial intelligence, especially in high-risk sectors such as healthcare and transport. Other rules will be announced in the coming months.
Referring to the possibility that the EU may hold internet companies responsible for hate speech and other illegal speech published on their platforms, Facebook said such an approach ignores the nature of the internet.
“Such liability would stifle innovation as well as individuals’ freedom of expression,” it said in its discussion document.
“Retrofitting the rules that regulate offline speech for the online world may be insufficient. Instead, new frameworks are needed.”