The Disturbing Growth of Apps That Virtually Undress Women Without Consent


A new breed of artificial intelligence-powered mobile applications that digitally alter photos to make women appear naked is growing rapidly, raising serious ethical and legal concerns around consent, privacy, and online harassment.

According to recent findings, a network of more than thirty so-called ‘undressing apps’ and websites attracted more than 24 million visitors in September last year alone, a sharp increase in popularity compared to previous years. Referral links promoting these fake nude services have also grown by more than 2,400% on platforms like Reddit and X since the start of 2023, signaling a disturbing new trend.

These applications use AI systems to digitally strip clothing from photos, creating realistic-looking nude images, often without the consent or even knowledge of the women depicted. Many services explicitly target images of women and attempt to monetize them, with some users even offering bounties for photo submissions.

These convincing fake nudes are a form of ‘deepfake’ – AI-manipulated media that experts warn is increasingly being weaponized for abuse. Lawmakers currently classify such non-consensual manipulated content as revenge pornography or non-consensual fake pornography, both of which carry severe personal and psychological consequences for victims.

Worryingly, reports indicate that in some cases underage girls are also being targeted by these apps. Yet there are currently no specific federal laws banning these exploitative and abusive deepfake practices in the United States. While some welcome the emerging technology, experts are nearly unanimous that AI tools enabling harassment, extortion, and violations of women’s consent and privacy urgently need regulation and oversight.

“What we’re seeing is a familiar story: new technologies are being used to further exploit women and girls,” said Dr. Sienna Watson, a leading digital harassment researcher at the nonprofit Protect Women Online. “Even as the law lags behind innovation, tech companies have an ethical obligation to prevent their services from enabling abuse.”

As invasive AI apps continue to outpace policy, women’s rights advocates emphasize that lawmakers and law enforcement must catch up to curb these disturbing trends before further harm spreads online.
