Articulating algorithmic ableism: the suppression and surveillance of disabled TikTok creators

Article / Journal
Author(s) / editor(s):
Jess Rauchberg
Year: 2025
Journal of Gender Studies, 1–12
Language(s): English
Abstract:
Shortly after its 2018 global launch, reports surfaced that content creation platform TikTok tasked its moderators with suppressing disabled creators’ user-generated content without formal notification to the users, a practice colloquially known as shadowbanning. This theoretical article introduces algorithmic ableism to interrogate how platform systems encode ableist ideologies into algorithmic recommendation infrastructures, reproducing dominant offline beliefs. Invoking algorithmic ableism, the article’s analysis highlights how platform companies rely on disability-related discrimination as a platform logic that reifies long-standing western biases about who belongs in public life. Supported by scholarship in critical disability and feminist creator studies, the article engages in a critical/cultural close reading of corporate and investigative cultural artefacts, using TikTok as a case study. In doing so, the article argues that algorithmic ableism reproduces bias against disabled and marginalized creators through content suppression and surveillance. The article’s conclusion offers additional insights into how disabled creators’ microactivist content creation subverts algorithmic ableism.
https://www.tandfonline.com/doi/epdf/10.1080/09589236.2025.2477116?needAccess=true
Post created by: Lymor Wolf Goldstein