Deepfake technology is a threat to all women – not just celebrities

As a new UK law tackling the creation of explicit deepfakes comes into effect, GLAMOUR explores their ongoing impact.

A landmark new UK law means that creating sexually explicit deepfake pornography will be made a criminal offence in England and Wales, thanks to a new amendment to the Criminal Justice Bill. This follows GLAMOUR UK's campaign to amend the legislation surrounding the creation of misogynistic deepfake porn. Ahead, Lucy Morgan discusses the impact of deepfake technology on both public figures and everyday women – as well as what the new law means, and the ongoing conversation about Big Tech's role in online image abuse…

It's a weekday night, and you're aimlessly scrolling through social media. In amongst the celebrity news, memes, and never-ending ‘discourse’, you come across a pornographic clip. Big deal – what's new? Well, this one is extremely violent. And the woman in the video? She's got your face.

In January 2024, a variation of this scenario happened to Taylor Swift. Pornographic deepfake images of the mega-star were circulated on various social media platforms, including X and Meta. Not only were the images created without the singer's consent, but they also reportedly depicted her being subjected to non-consensual sexual acts.

She's not the only public figure to have been subject to such abuse. In May 2024, a Channel 4 News analysis found that nearly 4,000 celebrities were listed on the most-visited deepfake websites.

According to NBC News, the deepfakes of Swift on X amassed over 27 million views and more than 260,000 likes in 19 hours before the account that initially posted the images was suspended. X has since blocked searches for ‘Taylor Swift’ on the site. Joe Benarroch, head of business operations at X, described the measure as “temporary”, adding that it was done with “an abundance of caution as we prioritise safety on this issue.”

A spokesperson for Meta has said: “This content violates our policies, and we’re removing it from our platforms and taking action against accounts that posted it. We’re continuing to monitor, and if we identify any additional violating content, we’ll remove it and take appropriate action.”

Swift was understood at the time (per the Mail) to be considering legal action against the deepfake website that hosted the images. However, many victims of deepfake pornography simply don't have the profile or resources to pursue justice in this manner. This doesn't only affect celebrities.


At the height of the Covid-19 pandemic, Helen Mort, a poet and lecturer from Sheffield, discovered that pornographic images, supposedly of her, were circulating on a porn site. Of course, these were deepfaked images. “It was profoundly unsettling and disturbing,” she tells GLAMOUR. “I felt unsafe.”

While the doctored images of Helen were deleted soon after she became aware of them, the deepfakes of Taylor Swift reminded her of the ordeal. “It brought back memories of my own experience,” she says. “I started thinking about the abusive images all over again in excruciating detail.”

She notes that non-consensual deepfake pornography is “probably shockingly common”. Indeed, a policy briefing on image-based sexual abuse notes that “vulnerable groups, particularly women and girls, face amplified risks and unique challenges in combatting deepfake image-based sexual abuse.”

Deepfake technology is clearly spinning out of control, and – surprise, surprise – it's women and girls who are bearing the brunt of it. Here, GLAMOUR explores the inherent misogyny within deepfake porn, the dangers it poses to all women, and whether or not it can be stopped.


What are deepfakes?

The Alliance for Universal Digital Rights defines deepfakes as: “Synthetic media that have been digitally manipulated to replace one person’s likeness convincingly with that of another. Creating deepfakes involves collecting real, everyday images of someone and manipulating them so as to create a false depiction of them doing or saying something which they have not done.”


There are myriad issues surrounding deepfakes, but by far the most pressing is the use of this technology to create non-consensual pornographic material. It is inherently misogynistic: a comprehensive 2023 report on deepfakes determined that deepfake pornography constitutes 98% of all deepfake videos found online. Worse still, 99% of those targeted by deepfake pornography are women.

“Deepfake sexual abuse is commonly about trying to silence women who speak out,” says Durham University Professor of Law Clare McGlynn, a leading authority on deepfake laws. “We see this with Taylor Swift. We see this with women politicians, where deepfake porn is an attempt to intimidate them. We see it with many women in the public eye.”

Amanda Manyame, Equality Now’s Digital Rights Advisor, who works at the intersection of tech and the law, agrees that women in the public eye are at particular risk of deepfake abuse. She tells GLAMOUR, “Anyone can be a victim of deepfake image-based sexual abuse, but women in the public eye and positions of authority – such as celebrities, politicians, journalists, and human rights defenders – are particularly targeted.”

But seeing high-profile women victimised in this way also has a profound impact on regular women and girls. When Ellie Wilson, an advocate for justice reform, tweeted about the troubling response to the deepfakes of Swift, she was met with her own flurry of online abuse. “People threatened to make similar deepfake images of me,” she tells GLAMOUR. “These attacks for merely stating my opinion highlight just how dangerous it is for women to simply exist on the internet.”

Olivia DeRamus, the founder and CEO of Communia, a social network created by and for women, notes that even speaking up against deepfaking puts other women in danger. “Just talking about [deepfaking] as a woman paints a target on my back, along with other advocates, the female journalists covering this, the #swifties speaking out, and even female politicians who want to tackle the issue.”

Professor Clare McGlynn emphasises that deepfaking represents a threat to all women and girls, citing the “potentially devastating impact on our private and professional lives.”


It's clear that deepfake technology is rapidly hurtling out of control. Amanda Manyame cites “rapid advances in technology and connectivity” that make it “increasingly easy and cheap to create abusive deepfake content”. She adds, “Cyberspace facilitates abuse because a perpetrator doesn’t have to be in close physical proximity to a victim.

“In addition, the anonymity provided by the internet creates the perfect environment for perpetrators to cause harm while remaining anonymous and difficult to track down.”

Moreover, most countries are ill-equipped to deal with tech-facilitated harms like deepfake image-based abuse. In the UK, sharing deepfake pornographic content without consent was already an offence under the Online Safety Act, but until now the law failed to cover the creation of such images. “This gap,” Manyame explains, “created an enabling environment for perpetrators who know they are unlikely to be discovered or punished. The situation is worsened by the lack of legal accountability governing the tech sector, which currently does not have to ensure safety by design at the coding or creation stage.”

Meanwhile, the tech sector itself is alienating victims. As Manyame tells GLAMOUR, “Content moderation on tech platforms relies primarily on reporting by victims, but reporting mechanisms are generally difficult to use, and many platforms frequently do not respond to requests to remove abusive content or only respond after a long time.”


What is the law on deepfakes in the UK?

Under a new law announced on 16 April 2024, those who create sexually explicit deepfakes will face prosecution – thanks to an amendment to the Criminal Justice Bill. The new offence, proposed by Conservative MP Laura Farris and the Ministry of Justice, will be punishable by an unlimited fine and a criminal record.

This is a welcome update to the existing legislation, which criminalises the distribution or sharing of deepfake porn but not its creation. Offenders can already face prison time for sharing an explicit deepfake image online or with others.


Can anything else be done about deepfake technology?

Between the Online Safety Act and the new amendment to the Criminal Justice Bill, both the sharing and the creation of non-consensual deepfake pornography are now criminal offences – which could, as Sophie Compton, co-founder of #MyImageMyChoice, a movement tackling intimate image-based abuse, tells GLAMOUR, create “greater accountability for tech companies.” Whether this legislation will be effective is another story.

She points out that search platforms drive plenty of traffic to deepfake pornography sites – can the Online Safety Act clamp down on this? “The government needs to tackle Big Tech and their role in promoting and profiting off deepfake abuse, and get the sites and web services that are profiting off of abuse blocked from the mainstream internet.”

Professor Clare McGlynn from Durham University notes that while the Online Safety Act has the potential to tackle deepfake pornography, “There is a real risk that the legislation is a damp squib, all rhetoric and little change.” She points out that Ofcom, the UK's communications regulator, is currently consulting on the guidance it will use to enforce the Act. “Ofcom needs to challenge the social media companies to make a step-change in their approaches […] It should focus on proactive regulation being human-rights-enhancing. It can enable women to live freer lives online.”

Ultimately, though, we need to address the misogynistic culture that empowers users to create harmful, non-consensual content of women. Helen Mort survived being deepfaked – yet she asks, “What are the cultural and social factors that make people abuse images in this non-consensual way?”

We're still looking for answers.


GLAMOUR has reached out to representatives for Taylor Swift and X for comment.

If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.

For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra.
