Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.
Jordan Wyatt | CNBC
In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used artificial intelligence to turn their Facebook photos into sexualized images and videos.
Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and of more than 80 women in the Twin Cities region. The discovery caused emotional trauma and led the group to seek the help of a sympathetic state senator.
As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet: many are promoted via Facebook ads, available for download on the Apple and Google app stores, and easily found through simple web searches.
“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.
Here are five takeaways from the investigation.
The women lack legal recourse
Because the women weren’t underage and the man who created the deepfakes never distributed the content, there was no apparent crime.
“He did not break any laws that we’re aware of,” said Molly Kelley, one of the Minnesota victims and a law student. “And that is problematic.”
Now, Kelley and the other women are advocating for a bill in their state, proposed by Democratic state Senator Erin Maye Quade, intended to block nudify services in Minnesota. Should the bill become law, it would levy fines on the entities that enable the creation of the deepfakes.
Maye Quade said the bill is reminiscent of laws that prohibit peeping into windows to snap explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said in an interview with CNBC, referring to the speed of AI development.
The harm is real
Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from the incident last year.
Sometimes, she said, the simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes welling with tears. That’s what happened at a conference she attended a month after first learning about the images.
“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”
Mary Anne Franks, professor at the George Washington…