Facebook, Google Face EU Fines if Extremist Content Stays Online for Over an Hour
Google, Facebook and Twitter must remove extremist content within an hour or face hefty fines, the European Commission's president has said.
"Europeans rightly expect their Union to keep them safe. This is why the Commission is today proposing new rules to get terrorist content off the web within one hour - the critical window in which the greatest damage is done," President Jean-Claude Juncker said.
Juncker announced new rules to get terrorist content off the web within one hour. The new rules are being presented one week ahead of the Informal Meeting in Salzburg, where EU leaders are expected to discuss security. Every internet platform that wants to offer its services in the European Union will be subject to clear rules to prevent its services from being misused to disseminate terrorist content. Strong safeguards will also be introduced to protect freedom of speech on the internet and ensure only terrorist content is targeted.
According to the EC, in January 2018 alone, almost 700 new pieces of official Da'esh propaganda were disseminated online.
The Commission has already been working on a voluntary basis with a number of key stakeholders - including online platforms, Member States and Europol - under the EU Internet Forum in order to limit the presence of terrorist content online. In March, the Commission recommended a number of actions to be taken by companies and Member States to further step up this work. Whilst these efforts have brought positive results, overall progress has not been sufficient.
The new rules proposed by the Commission will help ensure terrorist content online is swiftly removed. The key features of the new rules are:
- The one-hour rule: Terrorist content is most harmful in the first hours after it appears online because of the speed at which it spreads. This is why the Commission is proposing a legally binding one-hour deadline for content to be removed following a removal order from national competent authorities;
- A clear definition of terrorist content: material that incites or advocates committing terrorist offences, promotes the activities of a terrorist group or provides instruction in techniques for committing terrorist offences;
- A duty of care obligation for all platforms to ensure they are not misused for the dissemination of terrorist content online. Depending on the risk of terrorist content being disseminated via their platforms, service providers will also be required to take proactive measures - such as the use of new tools - to better protect their platforms and their users from terrorist abuse;
- Increased cooperation: The proposal sets up a framework for strengthened cooperation between hosting service providers, Member States and Europol. Service providers and Member States will be required to designate points of contact reachable 24/7 to facilitate the follow-up to removal orders and referrals;
- Strong safeguards: Content providers will be able to rely on effective complaint mechanisms that all service providers will have to put in place. Where content has been removed unjustifiably, the service provider will be required to reinstate it as soon as possible. Effective judicial remedies will also be provided by national authorities, and platforms and content providers will have the right to challenge a removal order. For platforms making use of automated detection tools, human oversight and verification should be in place to prevent erroneous removals;
- Increased transparency and accountability: Transparency and oversight will be guaranteed with annual transparency reports required from service providers and Member States on how they tackle terrorist content, as well as regular reporting on proactive measures taken;
- Strong and deterrent financial penalties: Member States will have to put in place effective, proportionate and dissuasive penalties for not complying with orders to remove online terrorist content. In the event of systematic failures to remove such content following removal orders, a service provider could face financial penalties of up to 4% of its global turnover for the last business year.
The proposal will need backing from the countries that make up the European Union as well as the European Parliament.
In response to the plans, Facebook said: "There is no place for terrorism on Facebook, and we share the goal of the European Commission to fight it, and believe that it is only through a common effort across companies, civil society and institutions that results can be achieved.
"We've made significant strides finding and removing terrorist propaganda quickly and at scale, but we know we can do more."
A spokesperson for YouTube added that the site "shared the European Commission's desire to react rapidly to terrorist content and keep violent extremism off our platforms."
"That's why we've invested heavily in people, technology and collaboration with other tech companies on these efforts."
Mozilla, however, called the proposed regulation "troublesome". "We welcome effective and sustainable efforts to address illegal content online. But the Commission's proposal is a poor step in that direction. It would undermine due process online; compel the use of ineffective content filters; strengthen the position of a few dominant platforms while hampering European competitors; and, ultimately, violate the EU's commitment to protecting fundamental rights."