Many European policymakers and governments have concerns about the impact of several types of online content and user behaviour. These concerns are outlined in the UK Government White Paper on Online Harms. Policymakers worry about content that may be illegal, such as some forms of hate speech, and content that is posted with the intent to incite violence for ideological and/or religious reasons. They are also concerned about content and online behaviours which are not illegal, but which they fear may cause harm to some users. These types of content include promotion of suicide or self-harm, cyberbullying, harassment, and disinformation.
Leading social media and content-sharing platforms have stepped up efforts and dedicated more resources to restricting the availability of both illegal content and legal content that is considered undesirable for reasons such as those outlined above. They have done so in part because of public pressure, and in part to improve the services and user experience they provide.
Policymakers are considering policy and legislation that will ‘hold platforms accountable’ and make them ‘take more responsibility’ for the content they host. In European countries, there is a growing sentiment that existing legislation, notably the European E-Commerce Directive (ECD), should be updated. The ECD establishes the principle that content hosts are not liable for user-uploaded content, unless they have been notified of illegality. The Directive’s provisions are general, and have been implemented differently in different Member States. The Court of Justice of the European Union (CJEU) has issued a number of rulings that clarify certain questions, but guidance as to the expectations content hosts must meet to maintain safe harbour protection remains vague. The Center for Democracy & Technology (CDT) has argued that the Directive should be supplemented with additional notice-and-action guidelines or legislation, but the Commission decided not to move forward with this type of initiative.
Now, however, the Commission is understood to be preparing policy options for new rules for content hosting: a Digital Services Act. The Act would add to several pieces of EU legislation adopted or proposed in the past few years, and to several Member State legislative initiatives focused on illegal and/or harmful content and on overall regulatory supervision of content hosts.
CDT proposes some fundamental principles that should inform future EU policymaking. This input is guided by CDT’s mission to protect the fundamental rights of internet users. While the concerns behind several new policy initiatives are legitimate, CDT emphasises that policy initiatives must be very carefully crafted so as not to harm free expression, access to information, and innovation and entrepreneurship on the internet.