How a “Nudify” Website Turned Friends Into Unexpected Leaders in the Battle Against AI-Generated Porn
In June 2024, technology consultant Jessica Guistolise received a late-night call that would alter her life. An acquaintance, Jenny, revealed disturbing evidence: her estranged husband had used an AI-powered platform called DeepSwap, a so-called “nudify” site, to generate deepfake pornography of more than 80 women in Minneapolis — all created by manipulating personal social media photos.

What Guistolise saw when she returned was shattering: photos from her family vacation, and even from her goddaughter’s graduation, had been turned into explicit fake content without her consent. “The first time I saw the images, something inside me shifted forever,” she recalled.

DeepSwap, which charges subscription fees for “premium” features, is one of dozens of nudify services that have surfaced since the explosion of generative AI. These apps allow anyone, regardless of technical skill, to produce realistic sexual deepfakes in minutes — a growing problem experts warn is spreading globally.

For Guistolise and her circle of friends, the discovery launched them into an uncharted legal and emotional battle. Victims quickly realized existing laws offered little protection: the manipulated images were never distributed, making prosecution nearly impossible. “He didn’t technically break any laws we know of, and that’s the problem,” said law student Molly Kelley, another victim who has since become an advocate for stronger AI legislation.

Their efforts eventually reached lawmakers in Minnesota, pushing for a bill that would fine companies $500,000 for every nonconsensual explicit deepfake created in the state. Advocates argue such action is long overdue, as victims often endure psychological trauma, paranoia, and even health complications caused by stress.

Globally, the crisis is escalating. In 2024, an Australian man was sentenced to nine years in prison for creating deepfake porn of 26 women, while incidents in schools underscored the technology’s easy accessibility. Despite tech giants pledging crackdowns, researchers continue to uncover nudify ads and apps across major platforms.

“This is not an issue that will fix itself,” Jenny emphasized. “We need accountability — not just for individuals abusing this technology, but for the companies enabling it.”

For Guistolise, Kelley, and others, the mission is clear: raising awareness and pushing for stronger laws to stop the weaponization of AI. “It’s terrifying how easy it is to create this kind of content,” Guistolise said. “People need to know it exists — and that it must be stopped.”
