Taylor Swift and the Dangers of Deepfake Pornography

[Image: Black keyboard with light-up yellow keys]

Disclaimer: We use "deepfake pornography" as a term to describe a form of image-based sexual abuse. Not all pornography is consensually made, and any pornography made or distributed without consent is a sexual violation that has no place on the internet.


Taylor Swift is one of the most famous and influential women on the planet. Her music has reached people around the globe, particularly girls and women. Her high-profile status has recently opened up a much-needed public conversation about how AI and image-based sexual abuse disproportionately impact and harm women. You are likely aware of the mass image-based abuse campaign against her: the creation and distribution of non-consensual deepfake pornography.

On January 24, 2024, users created sexually explicit AI-generated images depicting Swift at a football game. On X (formerly known as Twitter), these images spread like wildfire, despite clearly violating the platform's policies. The pictures gained over 45 million views, along with hundreds of thousands of likes, bookmarks, and reposts, over a 17-hour period before they were taken down. This is not the first time false sexual content of Swift has been created and distributed without her consent. In her early twenties, a celebrity gossip website published a fake topless photo of her. Years later, Ye (formerly known as Kanye West) had a nude wax model of her created for one of his music videos. That wax model, along with nude models of other public figures made for the video without their consent, was later turned into an art exhibit. One Swift fan correctly noted that, seven years after it dropped, this video, which Swift has since referred to as "revenge porn," is still readily available on YouTube. All of these instances are sexual violations, and they speak to the longstanding sexual dehumanization of women in the public eye. With the current ease of access to artificial intelligence (AI), images like the ones of Swift that recently went viral demonstrate how pervasive an issue false sexual imagery is, how readily many users online disregard the importance of consent, and how difficult it is even for the world's most famous people to get these images removed.

A "deepfake" is a fake image or video created through "deep learning," a process by which a machine learns to produce its own image, audio, or video of a subject based on examples of that subject submitted to it. For example, fed enough images of a person, it can create convincing images or videos of that person in fake scenarios or performing actions that never happened. While deepfake software wasn't designed with the explicit intent of creating sexual imagery and video, that has become its most common use today. According to Sensity, a company that monitors AI-generated synthetic media, "96% of deepfakes are sexually explicit and feature women who didn't consent to the creation of the content." Many of these visuals are altered images of underage girls. These deepfakes all fall under the umbrella of Non-Consensual Intimate Imagery (NCII). With over 9,500 sites dedicated to promoting and profiting off such violations (in addition to the abuses still found on mainstream social media platforms), we all need to understand this problem, its scope, and its potential consequences.

The Problem

Deepfake pornography isn’t limited to people of any specific background, occupation, or age. Whether it’s a middle-schooler bullied by her peers, an ASMR (Autonomous Sensory Meridian Response) YouTuber, a journalist covering a child sex abuse story, a mental health expert speaking before a parole board, or a world famous pop star, all are vulnerable to this type of attack and should have their right to privacy and autonomy respected. 

On the internet, there is no shortage of consensually made and distributed pornographic material, yet deepfake pornography is becoming increasingly popular, driven by misogyny.

Vice Senior Staff Writer Samantha Cole remarked the following about the sexist motivations behind deepfake pornography creation: 

“In these online spaces, men’s sense of entitlement over women’s bodies tends to go entirely unchecked. Users feed off one another to create a sense that they are the kings of the universe, that they answer to no one. This logic is how you get incels and pickup artists, and it’s how you get deepfakes: a group of men who see no harm in treating women as mere images, and view making and spreading algorithmically weaponized revenge porn as a hobby as innocent and timeless as trading baseball cards...we must first acknowledge that the technology that could start a nuclear war was born as a way for men to have their full, fantastical way with women’s bodies.”

Deepfake pornography is not a healthy expression of sexuality, an appropriate approach to humor, or an ethical use of another's image. Creating and sharing these deepfakes is demoralizing, degrading, and dehumanizing, and normalizing such behavior has real-world consequences.

The Impact

Many victims do not know who created these images and videos, and they feel isolated, disconnected, and mistrustful of many of the people around them. They are likely to experience symptoms of poor mental health, such as depression and anxiety. They may suffer reputational damage, compounding this emotional distress. They withdraw from areas of their public life, whether online or in person. They may lose jobs and job prospects. All of this falls under the "silencing effect," a term coined by Amnesty International to describe how victims are effectively silenced by the lasting ramifications of online gendered abuse. Victims live with this public humiliation and its fallout while the perpetrators thrive in secret.

As the abuse of AI has become more widely acknowledged and discussed, so has the outcry against deepfake pornography. Last year, the documentary Another Body was released, following one college student's experience of discovering that deepfake NCII had been created of her and her peers. The documentary's creators launched #MyImageMyChoice, a campaign calling for both tech companies and governments to confront deepfake pornography and other forms of online gendered abuse head-on.

Legal Landscape

Currently, there is little legal recourse for victims of deepfake pornography. While a few states have enacted laws against deepfake pornography, most have not, and there is no federal law prohibiting it in the United States. Given the international scope of the internet, inconsistent regulation across states and countries creates a barrier for victims who wish to report the abuse and pursue justice through the legal system.

In light of Swift's viral pictures, discussion about creating US federal laws against deepfake pornography has been revitalized. While responses to this issue must come from governments and tech companies, there are also preventative tools, resources, and practices everyone can use.

Tips

  • Report any content that sexually humiliates or degrades other users.
  • For images/videos depicting minors, use Take It Down, a fingerprinting service developed by the National Center for Missing and Exploited Children.
  • If you have intimate images of yourself that may be shared without your consent, use the StopNCII tool to create a digital fingerprint (hash) of them so that participating companies can detect and remove matching content from their platforms.
  • Shift culture and change norms in your social media circle to encourage healthy discourse, and discourage the amplification of bad social media behavior. See New Public’s recommendations for changing social media practices at large.
  • Promote and use materials from our previous Sexual Assault Awareness Month theme, “Building Safe Online Spaces Together”. Our resources about Practicing Digital Consent, Taking Action to Intervene, and Know[ing] the Facts may be valuable tools to educate and empower your social circle about what they can do to create safe online spaces going forward.

The future of deepfake pornography, and how it will impact adults and children, will be an ever-evolving discussion, but it is one we must continue to have. It is our collective role to uphold the right to privacy and protection against all forms of NCII. Whether it targets a pop star or one of our peers, this is an issue we simply cannot shake off.