11 September 2023

How the EU's Digital Services Act poses risks to fundamental rights and democracy

(Published on Brussels Report, 11 September 2023.)

The Digital Services Act (DSA) aims to provide a European solution to an important societal challenge: illegal content and misinformation online. These are real and significant problems, for which new legislation was indeed needed. The DSA was developed and approved during the pandemic. It is a lengthy and complex text, and I believe it received insufficient political and public scrutiny.

The DSA does contain highly valuable provisions, e.g. mechanisms for fighting child abuse and for enforcing transparency requirements in online advertising and recommendation systems. However, there are also concerning parts, which may undermine fundamental rights, especially freedom of expression and information, a cornerstone of European democracies enshrined in Article 11 of the EU Charter of Fundamental Rights. In this opinion piece, I summarize my understanding of some key points in the DSA, omitting technical details and nuances, and emphasizing those potential risks.

-        The DSA requires that online platforms make it easy for anyone to report allegedly illegal content. Online platforms (and providers of so-called ‘hosting services’ more generally), such as social media platforms, must provide user-friendly mechanisms that allow anyone to ‘notify’ (i.e. flag or report) content on the platform (e.g., a message on X) that they believe to be illegal (Art 16).

-        Online platforms become liable for reported content. Once notified, platforms (e.g., X) become liable for the notified content if it is indeed illegal (Art 6, and Art 16 paragraph 3). They must then quickly take appropriate action against it (suppressing or removing the content, suspending or blocking the author, etc.) (Art 16).

-        Notices submitted by so-called Trusted Flaggers must be handled with priority. Trusted Flaggers are organizations that will be certified as such by the national Digital Services Coordinators (DSCs) (see below and Art 49-51). Platforms must handle notices submitted by Trusted Flaggers as a matter of priority (Art 22). Trusted Flaggers who too often submit inaccurate notices can lose their privileged status, but risk nothing more than that (Art 22, par 6-7).

-        Complaint and redress procedures must be established, but they will take time. Complaint and redress procedures must be established and clearly communicated (see e.g. Art 16 par 5, and Art 17 par 3(f)). Such procedures must include an internal complaint-handling system (Art 20), out-of-court dispute settlement bodies (Art 21), and judicial redress. However, these procedures can drag on for months, during which time the measure (suppression, removal, etc.) remains in force.

-        Terms & Conditions (T&Cs) are used as a means against undesirable (but not illegal) content. Notifications can compel a platform to take measures for two possible reasons (e.g., Art 17 par 1):

1.     because the platform also deems the content to be illegal (or, perhaps more often, because it does not have the time or resources to investigate, and thus prefers to err on the side of caution), or

2.     because the content violates the platform's T&Cs (but is not necessarily illegal).

Remarkably, the DSA explicitly highlights the possibility for online platforms to rely on Trusted Flaggers or similar mechanisms for policing content that is incompatible with their T&Cs (e.g., Recital 62). This contrasts with the stated role of Trusted Flaggers, which is to flag illegal content (Art 22).

-        Mandatory risk assessments and mitigating measures allow the European Commission to influence the T&Cs, content moderation policies, and more, of very large online platforms. Very large online platforms and search engines (X, Facebook, YouTube, Instagram, Google, Snapchat, TikTok, etc.) are required to regularly perform analyses of “systemic risks” (Art 34) and take mitigating measures (Art 35). This may include tightening and extending their T&Cs, strengthening their cooperation with Trusted Flaggers, altering their algorithms, etc. Such risk assessments explicitly go beyond illegal content: they aim to address allegedly deceptive content and disinformation (see also Recital 84). As such, the DSA grants the European Commission the power to suppress not only illegal content, but also content it considers undesirable because it is allegedly detrimental to ‘civic discourse’, public security, public health, and more.

-        A crisis response mechanism provides the European Commission with an additional means of exerting control over non-illegal content. In times of ‘crisis’ (proclaimed by the European Commission itself, on the initiative of the new European Board for Digital Services), such as a threat to security or public health, the European Commission can demand additional measures from the very large platforms and search engines (Art 36 and 48, and Recital 91). These measures, which may include adjusting the T&Cs and the content moderation processes, intensifying cooperation with Trusted Flaggers, etc., must then be taken by these platforms as a matter of urgency.

-        Codes of Conduct provide a further mechanism for the European Commission to exert control over non-illegal content. The European Commission will ‘encourage’ the drafting of, and adherence to, ‘voluntary’ Codes of Conduct for the sector (including quantifiable KPIs and the obligation to report regularly to the Commission and the relevant Digital Services Coordinator). The purpose is not only to combat illegal content, but also the so-called ‘systemic risks’ (Art 45), with specific references to ‘disinformation’ (Recital 104). The phrasing of Art 45 and its motivation (e.g. Recitals 103 and 104) almost read like “an offer they can't refuse”.

-        Transparency is limited. There are various provisions concerning transparency (e.g., Art 15, 17, 22 par 3, 24, and 42, and Recital 122), but these operate mostly at an aggregated level, rather than at the level of individual content items and moderation decisions. This makes independent scrutiny of the impact on freedom of expression and information very difficult.

DSCs will have the authority to grant a ‘vetted researcher’ status to particular researchers. Very large online platforms and search engines can then be compelled to provide these vetted researchers with access to data for the purpose of investigating systemic risks in the European Union, and for assessing the adequacy, efficiency and impacts of the risk mitigation measures. Unfortunately, the ‘technical conditions’ (which data, which permitted purposes, etc.) are yet to be determined, and the power to determine them is delegated to the European Commission (Art 40 par 13).

-        Fines for non-compliance are exorbitant, but not for transgressions of fundamental rights. Fines for non-compliance with the obligations under the DSA are exorbitant: up to 6% of global annual turnover (Art 52 and 74). The DSA does pay occasional lip service to fundamental rights, including freedom of expression and information (e.g., Art 35 par 3 and Recitals 86 and especially 153). However, I could not find anything about fines for overzealous ‘content moderation’ that violates these fundamental rights. It all remains very vague, without concrete guarantees. The DSA asserts one’s right to file a complaint (Art 53), and to request compensation in case of loss or damage resulting from non-compliance with the DSA (Art 54). But it is doubtful whether a violation of fundamental rights (especially freedom of expression and information) would be recognized as non-compliance with the DSA.

It is hard to shake the feeling that it was a mistake to write and approve this Act between 2020 and 2022, during the pandemic, when authorities were struggling to control the global narrative. In my opinion, the European regulator went grocery shopping on an empty stomach, which is never wise.

It is now up to the member states to implement the DSA in such a way that the risks to our fundamental rights and to the foundations of our European democracies remain as small as possible.

