Facebook is Expanding Efforts to Protect Elections in 2019
Facebook is tightening the rules and safeguards around political adverts to prevent foreign interference in elections, including those in Europe this year.
Ahead of the European Parliament election, Facebook said it would launch additional tools in the EU in late March to help prevent foreign interference and make political and issue advertising on Facebook more transparent. Advertisers will need to be authorized to purchase political ads; Facebook will give people more information about ads related to politics and issues; and it will create a publicly searchable library of these ads, retained for up to seven years. The library will include information on the range of each ad’s budget, the number of people it reached and the demographics of who saw it, including age, gender and location. Facebook will also launch these transparency tools for electoral ads in India in February and in Ukraine and Israel before their elections, with a global expansion before the end of June.
Facebook is also planning to set up two new regional operations centers focused on election integrity, located in its Dublin and Singapore offices. These will allow Facebook's global teams to work better across regions in the run-up to elections, and will further strengthen the company's coordination and response time between staff in Menlo Park and in-country. The new teams will add a layer of defense against fake news, hate speech and voter suppression, and will work cross-functionally with the company's threat intelligence, data science, engineering, research, community operations, legal and other teams.
Facebook says it has more than 30,000 people working on safety and security across the company. The company claims it has also improved its machine learning capabilities, which allow the social network to be more efficient and effective in finding and removing violating behavior.
Regarding fake news, Facebook follows a three-part framework to improve the quality and authenticity of stories. First, the company removes content that violates its Community Standards. Then, for content that does not directly violate the Community Standards but still undermines the authenticity of the platform, such as clickbait or sensational material, Facebook reduces its distribution in News Feed so that fewer people see it. Finally, Facebook informs people by giving them more context on the information they see in News Feed.
The world’s largest social network has faced pressure from regulators and the public after last year’s revelation that British consultancy Cambridge Analytica had improperly acquired data on millions of U.S. users to target election advertising.