Internet Policy Review Special issue | Content Moderation on Digital Platforms

Article / Journal
Author(s) / editor(s):
Internet Policy Review
Year: 2025
Abstract:
In this special issue, we use “content moderation” to refer to the multi-dimensional process through which user-generated content is monitored, filtered, ordered, enhanced, monetised or deleted on social media platforms. This process encompasses a wide diversity of actors who develop specific practices of content regulation. Users, non-governmental organisations (NGOs), activists, journalists, advertisers, experts, designers and researchers are increasingly involved in moderation-related activities, whether independently of, in partnership with, or in opposition to public authorities and firms. However, their precise contribution to the democratisation of content regulation, and to the balance between public and private interests in platform governance, remains understudied. Following the call to expand content moderation research beyond the relationship between states and firms (Gillespie et al., 2020), the goal of this special issue is to gather empirical studies that characterise the contribution of non-state actors to the current internet regulatory framework, and to provide new insights into their various actions and strategies.
https://policyreview.info/articles/analysis/introduction-content-moderation-digital-platforms