A woman whose best friend made deepfake porn videos of her and shared them online is campaigning to make creating them illegal.
In 2019, ‘Jodie’ began seeing pictures of herself popping up on dating apps, but tried to ignore it.
But they soon began to appear on Twitter alongside messages soliciting sex, before later appearing on revenge porn websites where men post pictures of their ex-partners and encourage others to troll them.
Jodie went to the police, but officers said they couldn’t do anything about it because she didn’t know who was creating or sharing the images.
Two years later, in March 2021, Jodie received an anonymous email alerting her to a deepfake porn website, which had taken normal, fully clothed pictures of her and used AI to create images and videos of her performing sexual acts.
Traumatised Jodie said the discovery was ‘the ultimate violation’, and confided in her family, friends and boyfriend.
She eventually worked out who was responsible – her own best friend, Alex Woolf – after spotting a shared photo that only he had access to.
Jodie reported it to the Met Police but the creation, distribution and solicitation of deepfake images wasn’t – and still isn’t – considered to be a crime.
In August 2021, Woolf admitted 15 charges of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network.
The derogatory comments accompanied pictures he uploaded to pornographic websites of the women, including Jodie, which had been taken from social media.
None of the pictures were pornographic or indecent, but he asked users to photoshop his victims’ heads onto pornographic actresses’ bodies, which were then posted on adult websites.
Only Jodie’s images were deepfaked, while others were normal images shared alongside grossly offensive language – and it was the language, not the deepfaked pictures of Jodie, which led to his conviction.
Woolf was given a 20-week prison sentence, suspended for two years.
Despite soliciting deepfaked sexual images of Jodie made by other users online, he was not charged for this because soliciting such images is not considered a crime.
Jodie, 26, from Cambridgeshire, said: ‘When I saw the AI-generated pictures and videos, I was terrified.
‘There were nine or ten pictures and videos of me being what I can only describe as raped, and anally penetrated.
‘There was one with a schoolgirl’s body with my face on it, in a student-teacher relationship.
‘It felt like the whole world collapsed around me.
‘To take my photo out of context and have it used like that – I think it’s everyone’s worst nightmare.
‘It was the ultimate violation.
‘In my victim impact statement I told how it made me feel suicidal and it has made it difficult for me to trust anyone again.
‘He was cowering in the corner when he was sentenced and he couldn’t even look at me when I spoke to him.’
In April this year, it was announced a new law would be introduced to crack down on deepfake image abuse.
But then the Conservatives were voted out of government, leaving Jodie and other victims questioning the future of the planned bill.
Clare McGlynn, Professor of Law at Durham University, who supports the campaign, explained the current legal standpoint.
She said: ‘The current law only makes it illegal to distribute or threaten to distribute intimate photos or videos of someone, including deepfake images, without their consent.
‘A vital creation offence was announced in April 2024 under the previous government, which aimed to criminalise the act of making these images in the first place, though it would have only covered certain cases of creation. However, when the general election was called, that commitment fell with the Criminal Justice Bill.
‘So far, the new Labour government has not made any commitment to reintroduce a creation offence, leaving a critical loophole in place.’
Jodie feels deepfake abuse is ‘the next iteration of violence against women and girls.’
She thinks the current situation allows ‘loopholes whereby perpetrators can get away with crimes without facing real repercussions or rehabilitation.’
She is now campaigning for harsher penalties for people who solicit and distribute deepfakes, as well as criminalising people who create them.
Reflecting on her experiences, Jodie said: ‘I’m still now full of rage. I try to channel it into raising awareness for other women.
‘There’s a misconception that because it’s online, it’s not real.
‘But for victims, knowing that people can’t tell these images are fake feels just as violating and humiliating as if they were genuine.’
In September, Jodie launched a petition for a campaign in partnership with The End Violence Against Women Coalition (EVAW), #NotYourPorn, Professor Clare McGlynn, and Glamour UK.
The petition reads: ‘For too long the government’s approach to tackling image-based abuse has been piecemeal and ineffective.
‘This crisis demands more.’