Transparent, Rapid and Contextualised? Comparing Content Moderation Requirements in Emerging EU Regulation

Authors

  • Therese Enarsson, Umeå University

Abstract

The European Union is making unprecedented efforts to address illegal and harmful online content through regulation, particularly targeting very large online platforms (VLOPs). This article examines how recent EU instruments – starting with the Digital Services Act (DSA) – assign responsibility for efficient and rapid content moderation while allowing for contextualised decision-making, with implications for both automated and human moderation. The analysis shows that advanced moderation is effectively mandated, requiring a mix of automation and human oversight, although the need for human contextual input is rarely stated explicitly. Instead, VLOPs are primarily tasked with ensuring transparency in moderation processes and decisions. As a result, responsibility is partly shifted to users, who must take certain actions themselves, such as requesting human review of automated decisions. The article also highlights the complex interaction between the DSA and other specialised regulations and self-regulatory measures on hate speech, terrorism-related content and disinformation – a dense regulatory landscape that warrants further study.

Published

30.09.2025

Section

Refereed Articles