Pawns in the Deepfake Game
"AI-assisted fake porn is here and we're all f*cked," says Samantha Cole. For the most part, she's right. Deepfake, or AI-assisted, porn superimposes known or "real" faces onto other performers' bodies in sexually explicit content. Its growing popularity signals the advent of the newest, shiniest weapon in the war of disinformation. I would argue against Cole on one point, however: we aren't all equally doomed. In researching this perverse form of "entertainment", the gendered nature of Deepfake porn has become increasingly obvious and, as with more traditional porn industries, it is clear that women are being disproportionately exploited. Female celebrities have long been the victims of attempts to violate their privacy, through attacks such as nude photo leaks; technological developments in the realm of Deepfakes, however, have added extremely realistic and degrading videos to the mix.
In 2017 a Reddit user called "deepfakes" began devoting their account to the creation of Deepfake porn videos of famous female actors. Maisie Williams, Taylor Swift, Aubrey Plaza, Gal Gadot and Scarlett Johansson have all been made victims of Deepfake porn. These videos are obviously extremely distressing for the women depicted; in Gadot's case, however, the non-consensual porno was made even more disturbing by its depiction of her participating in incestuous sex with her step-brother. When you look at the list of those affected by Deepfake porn, the sexism is painfully clear. Nina Schick, an expert in the field of disinformation, has stated that of the hundreds of videos of celebrities she found, there was "no Brad Pitt, George Clooney, or Johnny Depp"; they were all of women. Despite the trauma that comes with this kind of identity manipulation, celebrity actresses do have one saving grace: as outlined by Johansson, due to their standing in public life, people rarely believe it is actually them starring in a porno.
Unfortunately, Deepfake porn is not targeted only at female actors, and women in other careers do not have this privilege. Rana Ayyub's experience is proof of this. Ayyub is a journalist who was due to appear on the news to discuss the controversy surrounding the rape of an eight-year-old Kashmiri girl by a Hindu man when a Deepfake porno of her was made public, seemingly in an attempt to undermine her politically. The video of Ayyub was circulated on WhatsApp which, due to its end-to-end encryption, is particularly difficult to trace. It was then shared to a Bharatiya Janata Party fan page, where it went viral. Ayyub was unable to get sufficient help from the police, and it was not until the UN intervened that proper attention was paid to her case. Legal help aside, the emotional effect of this event is clear: Ayyub has stated, "I used to be very opinionated, now I'm much more cautious about what I post online."
Googling your own name is fairly high up on the list of aimless narcissistic pursuits we've all undertaken at one stage or another. Sometimes we do it to ensure that future employers wouldn't find anything untoward if they were to do the same, but mostly it's just blind curiosity. The results are never that exciting either: usually a myriad of old social media photos, perhaps a few embarrassing pictures from school events, nothing noteworthy. For Noelle Martin, this was not the case. She describes how, upon searching for an image of herself from social media, her screen "was flooded with that image and dozens more images of [her] that had been stolen from [her] social media, on links connected to porn sites". At the time, Martin found that in Australia, where she is from, there was no legislation in place to prevent or prosecute the circulation of non-consensual synthetic images or videos. Because of this, Martin approached those posting the videos herself, requesting that they take them down. All of them refused, with one webmaster even demanding nude photos in exchange for the removal of a video. After a great deal of activism, Martin was successful in having new legislation enacted in 2019; however, it only applies in Australia.
The process of having these videos removed is extremely complicated. Legal protocols differ massively around the globe, so while a video can be censored from view in one place, people in other countries can still gain access to it. Johansson stated, "I think it's a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself." This outlook, whilst depressing, is easily understood in the face of such an uncontrollable spread of disinformation. Furthermore, judging how to raise awareness about Deepfake porn is extremely difficult. In researching this article, I have found that dodging the actual videos can be extremely challenging. Ayyub stated, "I didn't speak about it for a long time because I worried the larger audience would not empathise or sympathise with me but they would want to explore it more. I didn't want Deepfake to get that kind of popularity."
In addition to this, the programmes needed to create these videos are publicly accessible; they are being made by Reddit users, not by special effects teams. While the sophistication of the videos is increasing at a terrifyingly rapid rate, the technology to detect Deepfakes is, comparatively, still in the Dark Ages. With the increasing pervasiveness of "revenge porn", the real danger that Deepfake porn, and its accessibility, poses will no doubt become harshly apparent. The terrifying fact that Martin's case makes clear is that we, the general public, are equally at risk of having Deepfake pornos made of ourselves. Schick has argued that "it is no exaggeration to say if you have ever been recorded at any time in any form of audio-visual documentation, be that a photograph, a video or an audio recording, then you could theoretically be the victim of Deepfake fraud." Victims of Deepfake porn videos argue that the knowledge that these videos are fake does little to comfort them; the feelings of violation that come with this kind of assault on privacy are so acute that, for all intents and purposes, the videos are real.
This Deepfake pandemic is highlighting, once again, that women are being used as pawns. Schick's dystopian warning, "Deepfakes are coming, and we are not ready", clearly needs to be heeded.