
People are now using AI to ‘fix’ women in another appalling way


Taylor Swift has been targeted by the dignifAI ‘movement’ (Picture: Neilson Barnard/Getty)

Just weeks after explicit, AI-generated images of Taylor Swift began circulating on X, the Grammy-winning superstar has now been targeted by the internet’s opposing band of misogynists.

They are the people behind often nameless and faceless accounts promoting the #dignifAI movement, digitally covering women’s bodies, erasing their tattoos and removing their curves.

For Taylor, that means being transformed into something like a 19th-century pilgrim woman, dress buttoned up to her neck and hair encased in a blue and white bonnet.

But like deepfake porn, it isn’t just celebrities who are targeted.

The trolls are trawling the web for pictures of women they deem ‘undignified’, and using Photoshop or AI to manipulate their dress, shape and look.

4chan, well-known for its support and proliferation of misogynistic internet campaigns, such as Gamergate in 2014 and the mass leak of celebrity photos the same year, is leading the dignifAI charge.

4chan is no stranger to controversy (Picture: Jaap Arriens/NurPhoto/Getty)

In one thread on the /pol/ board, described as ‘notoriously hateful’ by 404 Media, a post reads: ‘With the power of AI, we will clothe the instahots. We will purifAI them of their tattoos, we will liberate them of their piercings.

‘We will lengthen their skirts.’

It also pledged to ‘commit unrelenting psychological warfare’, and offered a number of links to image-editing tutorial videos to help spread the practice further.

But like so many damaging and dangerous internet trends, it is not confined to one platform, with new X posts using #dignifAI appearing hourly – including from a dedicated @dignifAI account with nearly 30,000 followers.


The bio simply reads ‘We are starting a movement’. 

Its most recent post shows two images of Doja Cat on the Grammys red carpet. The first, on the left, is untouched. The second shows the artist with all of her tattoos removed.

Miley Cyrus has been given the same treatment, her shimmering see-through mesh dress replaced by a neck-high, knee-length rainbow shift dress.

Members of the dignifAI community were not fans of Miley Cyrus’s Grammys outfit (Picture: Jeff Kravitz/FilmMagic)

The ‘movement’ has also garnered support from right-wing commentators including Jack Posobiec and Ian Miles Cheong.

Posobiec reposted an image of a woman in a bikini holding a French bulldog puppy, altered to show her in a long brown medieval-style dress holding a baby.

The medieval look is a popular one for those in the movement, a telling insight.

Some argue #dignifAI is a deliberate wind-up, but it goes hand-in-hand with the #tradwife movement. Believers argue women and men should revert to outdated and damaging gender roles – the women cooking, cleaning and having babies, the men going out to work.

But this trend is becoming increasingly popular with women as well as men and boys, with TikTokers in particular showing off their #tradlives.

And as 404 Media notes, the person behind the @dignifAI account has attempted to throw in a smidge of gender equality by covering up Elon Musk in his swimming trunks and removing Post Malone’s tattoos.

But at the core of the trend is, once again, others’ need to control women’s bodies.

In 2024, technology offers many more ways to achieve this – a problem that, like the heart of the issue itself, is not easy to fix.


