In Europe, policy-makers’ attitudes are hardening against the benign neglect that has characterised the last 20 years of technological expansion. Instead, it is now routine to see promises to crack down on the “new wild west”. The prime target: revisiting intermediary liability safe harbours in 2019.
Policy-makers face difficult trade-offs when trying to regulate tech companies. The current environment has fostered growth and innovation, but the pace of technological change has made it difficult for countries to contain the bad actors proliferating on online platforms.
Online platforms in the EU have been protected since the turn of the millennium by the eCommerce Directive, which categorises them as “mere technical, automatic, passive” intermediaries. Under this law, online intermediaries are shielded from liability for the dissemination of harmful content through safe harbours, and lose that protection only if they fail to remove such content after receiving notice of it.
Today, one does not have to look far to see these provisions under sustained criticism. Even the platforms themselves are questioning their own responsibility in the dissemination of harmful content, ranging from misinformation — or ‘fake news’ — and cyberbullying to the spread of terrorist content. Following the Cambridge Analytica scandal, Facebook’s Mark Zuckerberg expressed willingness to work with legislators to develop the “right regulation”.
In the EU, previous steps taken to limit harmful content include the directives on child protection and combatting terrorism, as well as non-legislative measures like the Code of Conduct on Countering Illegal Hate Speech and the European Strategy for a Better Internet for Children. These liability protections are also considered the major obstacle for those seeking to contain pirated content, although efforts to make platforms license copyrighted content are stalling in EU negotiations.
With providers now curating content with algorithms, many fail to see how this editorial control differentiates them from publishers. Although no legislation has been proposed, a 2017 communication from the European Commission recommended that online platforms actively monitor, identify, and remove harmful content. While not legally binding, the stance taken by the EU demonstrates an inclination to establish new obligations and place new responsibilities on online platforms. It remains to be seen whether this communication will lead to legislation.
Even in the UK, traditionally soft on regulation, attitudes have hardened just as Brexit brings into view the point at which it can depart from EU rules. Initiatives have piled up, starting with the Digital Economy Act 2017, which aimed in part at protecting children online. An Internet Safety Strategy, set to tackle a full range of online harms from cyber-bullying to child exploitation, is due early this year. Many of these measures will certainly raise the cost of maintaining these safe harbours, if not remove them entirely, but the direction of travel was made clear by the Conservatives’ 2017 election manifesto, which called on the UK to become “the global leader in regulation of the use of personal data and the Internet”.
In 2019, there is every reason to expect the debate to be re-opened. Not least, a new European Commission will be in place and under pressure to revisit these rules. Compounding this, revelations about the harms propagated on online platforms show no sign of abating and, be it election tampering, cyber-bullying or hate speech, legislators are poised to step in against what they see as the common denominator. The risk in Europe, as elsewhere, is that these regimes fragment and contradict each other, while each claims the right to enforce its views outside its borders.
The question of platform regulation challenges liberal societies by magnifying the tensions inherent in their set of liberties: freedom of speech, freedom from discrimination, the right to exploit one’s intellectual property, the right to anonymity, and the necessity of identity for enforcing laws. In a continent where deep cultural differences shape the relative weight of these rights, online platforms must come prepared with answers on the responsibilities they can accept.
Authors:
Mike Laughton, Policy Analyst, Access Partnership
Héloïse Martorell, Content Manager, Access Partnership