
Viral Explicit AI-Generated Photos of Taylor Swift Increase Concern About Deepfake Content

This article is written by a student writer from the Her Campus at UCF chapter.

Sexually explicit images of Taylor Swift, deepfakes generated by artificial intelligence, circulated on X (formerly Twitter) during the week of Jan. 22, and the platform did not begin taking them down until Friday, Jan. 26.

Deepfakes are lifelike videos or images created with artificial intelligence, typically using face- and audio-swapping technology. The graphic content of Swift was viewed millions of times before the platform took it down. As reported by The Washington Post, one of the most explicit posts gained over 45 million views. The images most likely originated in a Telegram group known for generating abusive content about women, according to 404 Media.

Swift’s dedicated fanbase, commonly referred to as “Swifties,” reacted long before the social media platforms did. Fans posted repeatedly with the hashtag #ProtectTaylorSwift, in an effort to counteract searches for the explicit content and replace it with real images of the singer. They criticized the platforms for allowing the explicit photos to stay trending for so long and for not taking action to remove them sooner.

X released an official statement early Friday morning about the circulation of the images, stating that the platform has a “zero-tolerance policy” toward Non-Consensual Nudity (NCN) images. The statement did not mention the images of Swift specifically, but said that the platform’s teams were “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”

The platform halted all searches for the singer on Saturday in an attempt to prevent users from accessing the remaining images while it worked to remove them, according to The Wall Street Journal. Some of the photos spread to other social networking websites, including Facebook. Meta, which owns Facebook, released a statement saying it strongly condemns “the content that has appeared across different internet services” and has worked to remove it, according to the Associated Press.

X’s statement regarding NCN images via @Safety on X

AI-generated explicit content is also becoming increasingly difficult to detect as artificial intelligence technology continues to develop and becomes more widely available. A 2023 study found that the amount of deepfake content detected online has increased by 550 percent since 2019 because of advancements in AI, according to BBC News. At the same time, moderation efforts by social media platforms to minimize the circulation of this content have declined in recent years.

This is not just a problem for celebrities. Female high school students in New Jersey and Washington State were also recently victims of deepfake pornography, according to NPR. The Princeton Legal Journal reported that “90-95% of deepfake videos are nonconsensual pornographic videos and, of those videos, 90% target women, mostly underage.”

No federal legislation currently exists to prevent the creation and spread of this type of fake explicit content. However, U.S. Rep. Joseph Morelle of New York has proposed the Preventing Deepfakes of Intimate Images Act, a bill that would make creating and sharing this type of content online a federal crime.

“The images may be fake, but their impacts are very real,” Morelle said in a statement reported by the Associated Press. “Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them.”

Kendal is a junior at the University of Central Florida majoring in journalism, with an additional degree in English literature and a certificate in Editing and Publishing. She is Editor-in-Chief and one of the Co-Campus Correspondents of Her Campus UCF. She has previously been a staff writer and associate editor for her chapter. Kendal has a passion for writing and reporting; her favorite coverage areas are breaking, political, and environmental news. She also loves to write about art and music (along with anything relating to love).