Five takeaways from the CNBC investigation into “nudify” apps and sites

Jessica Guistolise, Megan Hurley and Molly Kelley speak with CNBC in Minneapolis, Minnesota, on July 11, 2025, about the fake pornographic images and videos depicting their faces, made by their mutual friend Ben using an AI site.
Jordan Wyatt | CNBC
In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used their Facebook photos, combined with artificial intelligence, to create sexualized images and videos.
Using an AI site called Deepswap, the man secretly created deepfakes of the friends and of more than 80 women in the Twin Cities area. The discovery caused emotional trauma and led the group to seek the help of a sympathetic state senator.
As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet, with many promoted via Facebook ads, available for download on the Apple and Google app stores, and easily accessible through simple web searches.
“This is the reality of where the technology is right now, and that means any person really can be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
CNBC’s reporting highlights the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.
Here are five takeaways from the investigation.
The women lack legal recourse
Because the women were not minors and the man who created the deepfakes never distributed the content, no apparent crime had been committed.
“He did not violate any laws that we’re aware of,” said Molly Kelley, one of the Minnesota victims and a law student. “And that is problematic.”
Now, Kelley and the women are advocating for a bill in their state, proposed by Democratic state Senator Erin Maye Quade, intended to block nudify services in Minnesota. Should the bill become law, it would levy fines on the entities that enable the creation of the deepfakes.
Maye Quade said the bill is reminiscent of laws that prohibit peering into windows to take explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said in an interview with CNBC, referring to the speed of AI development.
The harm is real
Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from last year’s incident.
Sometimes, she said, the simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes welling with tears. That is what happened at a conference she attended a month after first learning about the images.
“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”
Mary Anne Franks, a professor at George Washington University Law School, compared the experience to the feelings victims describe when discussing so-called revenge porn, the posting of a person’s sexual photos and videos online, often by a former romantic partner.
“It makes you feel like you don’t own your own body, that you can never take back your own identity,” said Franks, who is also president of the Cyber Civil Rights Initiative, a nonprofit dedicated to combating online abuse and discrimination.
Deepfakes are easier to create than ever
Less than a decade ago, a person would have needed to be an AI expert to make explicit deepfakes. Thanks to nudify services, all that’s required is an internet connection and a Facebook photo.
Researchers said new AI models have helped usher in a wave of nudify services. The models are often bundled into easy-to-use apps, so that people without technical skills can create the content.
And while nudify services may include disclaimers about obtaining consent, it’s unclear whether there is any enforcement mechanism. Additionally, many nudify sites market themselves merely as so-called face-swapping tools.
“There are apps that present themselves as playful but are in fact primarily intended for pornographic use,” said Alexios Mantzarlis, an AI safety expert at Cornell Tech. “That’s another wrinkle in this space.”
Nudify service Deepswap is hard to pin down
The site that was used to create the content is called Deepswap, and there’s not much information about it online.
In a press release published in July, Deepswap used a Hong Kong dateline and included a quote from Penyne Wu, who was identified in the release as CEO and co-founder. The media contact was Shawn Banks, who was listed as marketing director.
CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
Deepswap’s website currently lists “MindSpark AI Limited” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”
However, in July, the same Deepswap page had no mention of MindSpark, and references to Ireland instead said Hong Kong.
AI’s collateral damage
Maye Quade’s bill, which is still under consideration, would fine tech companies that offer nudify services $500,000 for each nonconsensual, explicit deepfake they generate in the state of Minnesota.
However, some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts.
In late July, Trump signed executive orders as part of the White House’s AI Action Plan, highlighting AI development as a “national security imperative.”
Kelley hopes that any federal AI push doesn’t compromise the efforts of the Minnesota women.
“I’m concerned that we will continue to be left behind and sacrificed at the altar of trying to win some geopolitical race for powerful AI,” Kelley said.
WATCH: The alarming rise of AI “nudify” apps that create explicit images of real people.