A deeply disturbing technological trend is going viral: AI-powered mobile apps that digitally remove women’s clothing without consent to create fake nudes.
According to recent findings, more than 30 of these so-called ‘undressing platforms’ attracted 24 million visitors in September last year alone, marking a sharp rise in popularity. The apps use artificial intelligence to alter clothed images of women so that the subjects appear naked, without their knowledge or consent.
Many such services explicitly target women and objectify female bodies by making them available for online voyeurism, without regard to personal boundaries. Experts classify these AI-generated fake nudes as forms of non-consensual pornography, which have lasting psychological effects and enable further abuse.
“It’s incredibly dehumanizing and traumatic to know that your body can be virtually violated in this way,” says Rachel Brown, a counselor who works with victims of revenge porn. “The feeling of powerlessness and fear that this evokes in women cannot be underestimated.”
Despite public calls for accountability, the apps continue to proliferate in what advocates call a “digital rape culture,” made possible by advancing technology. While some social platforms have banned associated search terms, researchers emphasize that stronger action from both tech companies and lawmakers is urgently needed to curb this troubling privacy violation.
“Adjustments alone are not enough; these apps require radical action that sends an unequivocal message rejecting toxic technology,” said nonprofit leader Sienna Watson. “If not, many more women will become unwilling participants in a database of AI-generated nudes, silently suffering trauma.”