AI pornography: an issue that transcends Taylor Swift

By Ezra Bucur

Creative Writing Editor

Via Axelle/Bauer-Griffin/FilmMagic

In January 2024, “Taylor Swift” began trending on X, formerly known as Twitter. This was not due to a music announcement, a memorable tour moment, or another controversial private jet trip: sexually explicit AI-generated images of the pop star had surfaced on the platform and were being shared by thousands of people. One post gained 45 million views, along with thousands of reposts and likes, in the 17 hours before it was deleted. Many people were drawn to the controversy, and as a result, even more fake images were generated. The original AI-generated images had first spread on Telegram and on 4chan, where they began circulating as early as January 6. The images on 4chan were created as part of a “challenge” to make inappropriate images of popular female singers.

The uproar was felt globally. While many participated in the humiliating spread of the pornographic material, others were outraged. Taylor Swift’s fanbase, nicknamed the “Swifties,” urged platforms to remove the images. Graphika, a research firm specializing in disinformation, began studying the origins of the images and eventually traced them back to the 4chan message boards. X blocked all searches for Taylor Swift in an attempt to minimize the damage caused by the images. The White House became involved shortly after and sought to pass legislation preventing the creation of AI pornography. Karine Jean-Pierre, the White House Press Secretary, stated that social media companies have a responsibility to enforce their own rules on nonconsensual sexual content. However, she also added that Congress would need to pass legislation, and highlighted that women are the primary targets of this new type of revenge porn.

“AI art” and “AI artists” have been the subject of controversy for just over a year, with many artists worried that major video game and film companies would opt for artificial intelligence programs over human artists who require payment. Over the summer of 2023, SAG-AFTRA, the union representing many Hollywood actors, went on strike. This was due, among other reasons, to production companies seeking to use AI to scan deceased performers without the consent of either their estates or the union. Actors would then no longer own their own image, nor have control over its usage, even decades after their passing. This is not an entirely new concept: deepfakes began to gain notoriety in 2017. Deepfakes are created with AI technology and aim to replicate a person’s physical appearance and voice in order to make them say, or do, anything their creators want. According to Deeptrace Labs, by 2019 around 14,678 deepfake videos were circulating online; 96% of them were pornographic, and all of them featured women. With the power of deepfakes, people can remove or add clothes to a person without their consent, or depict them committing obscene acts they never committed.

As stated by White House Press Secretary Karine Jean-Pierre and shown by the Deeptrace Labs research, women are the main victims of this sextortion. While these algorithms can manipulate images of men as well, they are primarily trained on pictures of women. AI appears to be just another way for women’s images to be controlled by men; the tool is new, but the idea is as old as time itself. Adding clothes onto pictures of women, for example, takes away their agency to decide which parts of themselves they choose to show, leaving it up to men to deliberate on women’s “purity.” Even worse, digitally removing women’s clothing violates their right to intimate privacy, a concept that covers personal information such as health or sexuality. This form of “revenge porn” is extremely detrimental to a woman’s self-esteem and self-perception, and it is an incredibly triggering experience.

Despite this problem being endemic to our society, it unfortunately took Taylor Swift, a wealthy white woman, to become the catalyst for governments and online platforms to pursue any real action. It was largely her enormous platform that allowed her to garner this much support; many other victims of this sexual violence have not received the same attention. For example, 14-year-old Mia Janin, a schoolgirl from London, died by suicide after boys in her grade edited photos of her face onto a nude body.

Taylor Swift’s situation, as terrible as it was, helped sound the alarm on one of the biggest problems of the digital age. Hopefully, as a result of this awareness, fewer and fewer women will fall victim to this exploitation.

