Beauty Standards In AI: Now Software Can Oppress Women Too

Social media keeps evolving, and the way we use and perceive it is changing rapidly. The biggest shift of recent years is the arrival of AI, which has been labeled a savior for students and the biggest nightmare for teachers. Artificial intelligence makes it easy to look up something you would otherwise have to search for on Google, wading through confusing websites that each give a different answer. ChatGPT, for instance, offers a simple summary of the most important information on a topic, sparing you the tiring effort of tracking down reliable sources. For the average internet user this is much easier, and often more instructive as well. AI has also made it possible to create images simply by telling the system what you want a picture of. What struck me while scrolling my Instagram timeline is that the images of women these systems produce are shaped around impossibly perfectionist beauty standards. Women are depicted with spotless faces, thin yet flowy bodies, and flawless hair. The impact such images could have on society is significant: AI has become a reflection of how society looks at women, and it reinforces these unrealistic and often very gendered perceptions of them. I think it is important to recognise that these tilted perspectives on women are reproduced in these systems and spread across the internet without any consideration of the harm they could do.

The pictures that artificial intelligence generates of women bundle beauty standards into an exaggerated, idealized and completely unrealistic version of how a woman “should look”. What is more, these AI-formed beauty standards can homogenize society’s expectations, which in turn shapes how women who are measured against these standards are seen. The Washington Post found that AI image tools steer depictions of women toward a “startlingly narrow vision of attractiveness”: the diversity of women’s beauty gets flattened into a homogeneous whole. It isn’t just the message about beauty standards that these pictures convey that is problematic; many of the images also reflect Western beauty ideals and underrepresent other ethnicities, as AI artist Abran Maldonado notes in the Washington Post. The same article found that when the tools were asked to portray “beautiful women”, 100% of the resulting pictures showed women with a thin body type, while for “ugly women” only 49% of the pictures did. At first glance this may not seem like a big issue, but as the use of AI grows, I think it is important to evaluate the impact these skewed systems have on the perception of women in society, and on how women themselves experience these exaggerated images and expectations. Since AI is increasingly used on social media, sometimes even without the consumer’s knowledge, it becomes ever harder to shield yourself from it. Unfortunately, the problem does not stop here, but extends to other areas as well.

Regarding these other areas, it isn’t only the beauty standards illustrated by AI that are problematic, but also the gendered norms that artificial intelligence recreates and reinforces more broadly. An article from UN Women describes an experiment in which Beyza Doğuç, an artist from Ankara, Turkey, asked generative AI (which creates new content) to write a story about a doctor and a nurse. In the answers she got back, the doctor was portrayed as male and the nurse as female, and this happened in the majority of cases. Eventually, it became clear that these gendered divisions of societal roles were encoded in the machine learning behind these systems, the process by which machines learn from and work with human language. Furthermore, in AI-enabled image generation, women are overrepresented in lower-paying jobs and underrepresented in higher-paying ones. This shows how AI takes gendered patterns from societal norms and reuses them, presenting the result as an accurate representation of reality that is tailored to society’s expectations. These issues surrounding artificial intelligence raise the question of how this could have happened.
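
To make this kind of probe a little more concrete, here is a minimal sketch of how one might repeat the doctor-and-nurse experiment and tally which gender each role is given. This is not the method used in the UN Women article (which does not spell one out); the `generate` function is a placeholder assumption standing in for whatever text-generation model or API you have access to, and the pronoun heuristic is deliberately crude.

```python
import re
from collections import Counter

def generate(prompt: str) -> str:
    """Placeholder for a text-generation call (e.g. a hosted LLM).
    This is an assumption: swap in whichever model or library you use."""
    raise NotImplementedError

def gender_of(role: str, story: str) -> str:
    """Rough heuristic: take the first gendered pronoun that appears
    within 200 characters after the first mention of the role word."""
    match = re.search(rf"\b{role}\b(.{{0,200}}?)\b(he|she|his|her)\b",
                      story, flags=re.IGNORECASE | re.DOTALL)
    if not match:
        return "unclear"
    pronoun = match.group(2).lower()
    return "male" if pronoun in ("he", "his") else "female"

def probe(n: int = 50) -> dict:
    """Ask for the same doctor-and-nurse story n times and count
    how often each role is written as male, female, or unclear."""
    counts = {"doctor": Counter(), "nurse": Counter()}
    for _ in range(n):
        story = generate("Write a short story about a doctor and a nurse.")
        for role in counts:
            counts[role][gender_of(role, story)] += 1
    return counts
```

A tally like this is only suggestive, but if the doctor is written as “he” and the nurse as “she” in the large majority of runs, it illustrates the kind of skew the article describes.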

According to TUDelft, the problem lies in the lack of diversity in the field of data science, which has led to the perpetuation of biases and stereotypes. The risk is that, in trying to represent reality as we know it, artificial intelligence takes an already distorted image of reality and produces generalisations from it, widening the gap between different groups in society. The solution TUDelft offers is Feminist AI, which “offers a new perspective on the AI industry by challenging the notion that AI design, development, and deployment should be guided by- and should mainly benefit, a limited group of individuals, predominantly from Western male backgrounds.” This feminist approach does not look only at women when fighting the oppression inherent in society, but addresses the problem in its entirety and actively tries to let marginalised voices be heard.

What can be concluded from this is that, in the world of artificial intelligence, we risk falling back into the very gendered norms feminists have fought for so long to eliminate. AI sets beauty standards higher than we could have imagined ourselves, which could bring real harm to women and other marginalised groups in society. I am not saying that artificial intelligence should be abolished, as AI also brings many benefits. What I do want to say is that the way AI is built can harm groups in society if they are not accounted for, and so we should look out for these groups and make sure that AI does not reinforce the gendered norms that already bring so much negativity to the world.
