midian18

What just happened? One of the most sinister trends to come from the advancement of AI image generation in recent years is the rise of websites and apps that can "undress" women and girls. Now, the San Francisco City Attorney's office is suing 16 of the most-visited of these sites with the aim of shutting them down.

The suit was the idea of Yvonne Meré, chief deputy city attorney in San Francisco, who had read about boys using "nudification" apps to turn photos of their fully clothed female classmates into deepfake pornography. As the mother of a 16-year-old girl, Meré wanted to do something about the issue, so she rallied her co-workers to craft a lawsuit aimed at shutting down 16 of the most popular undressing websites, writes the New York Times.

The complaint, which has been published with the websites' names redacted, states that the sites were collectively visited 200 million times during the first six months of 2024. One of these undressing sites advertises: "Imagine wasting time taking her out on dates, when you can just use [the redacted website] to get her nudes."

City Attorney David Chiu said that the sites' AI models have been trained using real pornography and images depicting child abuse to create the deepfakes. He added that once the images were circulating, it was almost impossible to tell which website had created them.

The suit argues that the sites violate state and federal revenge pornography laws, state and federal child pornography laws, and the California Unfair Competition Law.

"This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation," Chiu said on X. "This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible."

The problem of using AI to create nude images of people without their consent predates the current wave of generative tools – in 2020, a deepfake bot on Telegram was found to have generated more than 100,000 fake nude photos of women from their social media images.

Recent advances in generative AI have exacerbated the deepfake issue, making the images appear even more realistic. The explicit Taylor Swift pictures that were shared online in January led to US lawmakers calling for action and Google banning ads for deepfake porn and undressing sites.

Earlier this month, a new bipartisan bill proposed holding entities accountable for producing non-consensual "digital replicas" of people. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024 (NO FAKES Act) would hold individuals and companies liable for damages if they create, host, or share unconsented AI-generated audio or visual depictions of a person.


 
Yes, imagine 'that girl' naked, doing things to her... and then later actually taking her out in real life and being so shocked that she looks NOTHING LIKE what the computer imagined that you run away...
 
Expect to see a surge in users and downloads before the federal bill passes. Of course, foreign sites will keep popping up. And let's not forget the dark web.
 
Ah yes, good old San Francisco. The city full of feces and drug needles on the streets is worried about websites creating nudes. What a great way to waste taxpayer money.
 
Some of you here in the comments do not understand where this leads: a generator that can create nudes of anyone, even children. There will be a turning point where these models can be trained to such an extent that I could create one of your wife as we speak and message you over social media claiming she is sending me nudes. Can you imagine for one second the trouble that would bring to families, or to people in general? Where someone has to prove that a nude is not actually them?

Doxing is real, and the above is a huge problem.
 
Some of you here in the comments do not understand where this leads: a generator that can create nudes of anyone, even children. There will be a turning point where these models can be trained to such an extent that I could create one of your wife as we speak and message you over social media claiming she is sending me nudes. Can you imagine for one second the trouble that would bring to families, or to people in general? Where someone has to prove that a nude is not actually them?

Doxing is real, and the above is a huge problem.
I hope I am sufficiently ugly that nobody would ever do this to me.
 
What I learned from this is to get a tattoo in a non-visible area so that I can undeniably refute claims of a nude being mine when a deepfake inevitably fails to include the tattoo.
 
What I learned from this is to get a tattoo in a non-visible area so that I can undeniably refute claims of a nude being mine when a deepfake inevitably fails to include the tattoo.
Erm, seems a bit extreme. I like that you think nude pics of you will be created and are trying to think ahead lol.
 
What I learned from this is to get a tattoo in a non-visible area so that I can undeniably refute claims of a nude being mine when a deepfake inevitably fails to include the tattoo.
How could you prove it was a fake, unless you showed your tattoo?
 
If anyone made a fake photo of my wife and sent it to me, it would be so easy to tell it was fake. I'd be like, "Oh look, they got rid of the C-section scars and the big belly. And when did she get a six-pack?"

I've seen all those AI porn pics and have to say those bodies are pretty awesome. But has anyone created AI porn with an actual realistic body? Like all the sites that used to show you actresses' nude photos from their various movies. Real actresses have real bodies, so it's easy to tell the difference between the real ones and the AI ones.

My GF, however, does have a six-pack, but I'm pretty sure I could still tell the difference.
 
How could you prove it was a fake, unless you showed your tattoo?
I'm pretty sure the only person you'd need to prove it was fake to would be a partner who has already seen the tattoo... I think that's the gist.
 
I quoted Vanderlinde. My own comment is below this quote. I have only 1 account. Do not attribute Vanderlinde's words to me. :mad:
 
