On 15 December, the European Commission proposed the long-awaited Digital Services Act – a key part of the plan to create a Europe that is ‘fit for the digital age’. The Digital Services Act (DSA) will endeavour to provide much needed clarity to the rules on platform responsibility, given the legal ambiguities coming from the 20-year-old e-Commerce Directive. The digital landscape has changed dramatically over the past two decades and the advent of online platforms such as Facebook, Twitter, Instagram, YouTube and TikTok, and the infrastructure they utilise, has created new risks that arguably undermine user safety and result in an uneven playing field.
The European Commission has already addressed sector-specific issues such as the protection of copyright and the dissemination of terrorist content (e.g. the Audiovisual Media Services Directive and the provisional agreement on the Regulation to Remove Terrorist Content Online), and the DSA will not alter their application. Instead, the DSA will require all digital service providers, from social media companies to online marketplaces and platforms, to take responsibility for the removal of illegal content. Through an asymmetric approach with a sliding scale of due diligence obligations for very large platforms, other online platforms, hosting services and all intermediaries, the Commission hopes to provide the right conditions for European companies to grow. Such a change will help deliver a modern Digital Single Market and help Europe assert its digital sovereignty.
The DSA is part of a wider initiative to impose ex ante rules on large platforms that act as gatekeepers between business users and their customers. Together with the Digital Services Act the Commission has proposed a Digital Markets Act (DMA) that seeks to rein in the dominance of very large platforms. The European Commission’s Executive Vice President for A Europe Fit for the Digital Age, Margrethe Vestager, considers that “the two proposals serve one purpose: to make sure that we, as users, have access to a wide choice of safe products and services online. And that businesses operating in Europe can freely and fairly compete online as they do offline”.
Legacy of the e-Commerce Directive
The e-Commerce Directive established the internal market clause (commonly known as the ‘country of origin’ principle), which requires information society service providers to comply only with the rules of the member state where they are established; the prohibition on general monitoring obligations for online intermediaries; and the limited liability clause for online intermediaries. The Directive provides a safe harbour from liability for companies acting as mere conduit, caching or hosting services in respect of illegal content posted by users. However, the language of these provisions was ambiguous, resulting in fragmentation across Europe that undermined legal clarity for businesses and consumers alike.
The DSA retains and clarifies these three core principles and introduces a Good Samaritan Clause to deliver a pan-European safe harbour provision that provides a liability exemption for intermediary services that undertake voluntary automated or human measures to detect and remove illegal content. The DSA proposes an asymmetric due diligence obligation and a common framework for enforcement.
In regulating platform responsibility, the DSA proposal sets out rules for the handling of illegal or harmful content online; the liability of online providers for third-party content; vetting obligations in respect of third-party suppliers; and the protection of the fundamental rights of users online. The obligations depend on the characteristics, size and social impact of the digital service provider.
Content moderation: Hosting services must implement user-friendly notice-and-action mechanisms for users to flag content; prioritise notices from ‘trusted flaggers’; issue an explanation of any content removal decision to the user; publish transparency reports on the removal or disabling of content that is illegal or contrary to their terms and conditions; and help settle contested moderation decisions through an out-of-court dispute settlement process.
Advertising transparency: Online platforms must inform users, in real time and for each advert, that they are seeing an advert; identify the legal person on whose behalf the advert is displayed; and provide information as to why that specific user was targeted.
Transparency in distance contracts: To help identify the sellers of illegal goods or services, online platforms that allow distance contracts between consumers and traders will be subject to ‘know-your-business-customer’ procedures.
Enforcement: National Digital Services Coordinators (DSCs) will oversee enforcement and will have the power to investigate, fine and impose restrictions on platforms, as well as coordinate with other DSCs to conduct cross-border investigations. An independent advisory group, the European Board for Digital Services, will issue guidance and help ensure consistency. The European Commission has supervisory and enforcement powers over very large platforms, i.e. those with over 45 million monthly users.
Penalties: Serious instances of non-compliance with the DSA may result in fines of up to 6% of the annual turnover of the service provider or platform. The failure to provide information or to submit to an inspection can result in a fine of 1% of annual turnover.
Impact for businesses
The sliding scale of due diligence obligations will affect very large platforms, online platforms, hosting services and all intermediaries to varying degrees, but there will be cost and complexity to be borne by all. Nevertheless, businesses will welcome the clarity on intermediary liability, as well as the retention of the core e-Commerce principles. Companies based outside the EU will need to appoint a legal representative in a member state of the European Union. All companies subject to the DSA will be acutely aware of the increased transparency obligations that will grant regulators greater access to their data. The prospect of fines could lead some digital service providers to take a harsher stance on hosting dubious content, which may result in legal costs in defence of their positions, as well as reputational damage.
The European Parliament and the Council will need to reach an agreement on the text of the DSA. Margrethe Vestager has said she is hopeful that the DSA (and DMA) will be written into Union law by mid-2022. This tallies with the French, who have indicated that they would like to conclude the files during France’s Presidency of the Council of the European Union in the first half of 2022.
Before then, the European Parliament and the Council will each have to reach their own positions before entering into negotiations on the final text. Portugal, which takes over the Presidency of the Council on 1 January 2021, is considering discussing the DSA within the Internal Market Working Party and the DMA within the Competition Council formation.
Meanwhile, the European Parliament has not yet selected the lead committees that will amend the Commission’s proposals. Pre-empting the Commission, the Parliament’s Internal Market, Civil Liberties and Legal Affairs Committees published three non-binding reports containing a wish-list for the DSA. It remains to be seen which committee will lead the process, but it may be that the Economic and Monetary Affairs Committee takes the lead on the DMA.
DSA as a model
If all goes well, the Commission is hopeful that the DSA, like the EU General Data Protection Regulation (GDPR), will become a global blueprint for regulating digital services. On 15 December the United Kingdom also published its long-awaited Online Harms proposal, which is similar to the Commission’s proposal. Meanwhile, the United States is still considering changes to its own liability regime housed in Section 230 of the Communications Decency Act.
We follow and engage on platform rules and technology policy on behalf of our clients. If you are interested in knowing more we can be contacted at email@example.com