European Parliament Approves New Online Terror Content Rules
In yet another blow to U.S. social networking firms, European lawmakers on Monday backed previously proposed rules that would fine tech giants if they fail to speedily remove terror propaganda from their sites.
The European Parliament’s civil liberties committee endorsed draft rules that would require web platforms to wipe Islamic State videos and other terror content from their services within one hour of receiving a removal order from national authorities. Companies could be hit with fines as high as 4 percent of annual revenue if they systematically fail to remove problematic content.
According to the rules, companies such as Facebook and YouTube will not be under a general obligation to monitor the information they transmit or store, nor will they have to actively seek facts indicating illegal activity.
If a company has been subject to a substantial number of removal orders, the authorities may request that it implement additional specific measures (e.g. regular reporting to the authorities, or increased human resources).
To help smaller platforms, MEPs decided that the competent authority should contact companies that have never received a removal order, providing them with information on procedures and deadlines at least 12 hours before issuing the first order to remove content they are hosting.
The legislation targets any material (text, images, sound recordings or videos) that “incites or solicits the commission or contribution to the commission of terrorist offences, provides instructions for the commission of such offences or solicits the participation in activities of a terrorist group”, as well as content providing guidance on how to make and use explosives, firearms and other weapons for terrorist purposes.
A final version of the text requires further negotiations between the EU’s three institutions: the Parliament, the bloc’s member states and the European Commission.
The push comes amid growing pressure on technology companies to curb illegal activity on their sites. The U.K. on Monday outlined plans for an industry-funded regulator that would police the technology companies’ platforms for harmful content, such as incitement to terrorism and child sexual exploitation.
EU officials say the efforts made by large tech firms to catch malicious posts do not go far enough.
Alphabet’s YouTube has also come under fire in recent weeks for failing to quickly remove a video live-streamed by the alleged gunman in the March 15 mosque attacks in Christchurch, New Zealand, which left 50 people dead.
Freedom of expression activists say the EU rules are part of a wider effort to make tech platforms legally liable for the content users upload to their sites. They worry that, to avoid fines, tech firms will err on the side of caution and remove more content than necessary. Smaller platforms are also likely to bear the brunt of the rules, since they have fewer resources than larger companies to comply with the law.
On the other hand, a crackdown on social media platforms could benefit traditional media, helping large TV networks and newspapers regain control of the news.