AI-generated faces influence gender stereotypes and racial homogenization

Article / Journal

Author(s) / editor(s):
Nouar AlDahoul, Talal Rahwan & Yasir Zaki

Year: 2025

Keywords: Artificial Intelligence, Racial homogenization, Stereotyping
Language(s): English

Abstract:
Text-to-image generative AI models such as Stable Diffusion are used daily by millions worldwide. However, the extent to which these models exhibit racial and gender stereotypes is not yet fully understood. Here, we document significant biases in Stable Diffusion across six races, two genders, 32 professions, and eight attributes. Additionally, we examine the degree to which Stable Diffusion depicts individuals of the same race as being similar to one another. This analysis reveals significant racial homogenization, e.g., depicting nearly all Middle Eastern men as bearded, brown-skinned, and wearing traditional attire. We then propose debiasing solutions that allow users to specify the desired distributions of race and gender when generating images while minimizing racial homogenization. Finally, using a preregistered survey experiment, we find evidence that being presented with inclusive AI-generated faces reduces people’s racial and gender biases, while being presented with non-inclusive ones increases such biases, regardless of whether the images are labeled as AI-generated. Taken together, our findings emphasize the need to address biases and stereotypes in text-to-image models.

https://doi.org/10.1038/s41598-025-99623-3

Post created by: Virginia Signorini
