Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.
“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It was something anecdotal that happened in other people’s lives; it wouldn’t happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.
At the end of September 2023, she received an email from an anonymous sender. Subject: “Realistic?” “We wonder which photo would resemble you best,” she reads.
Attached were five photos of her.
In the original content, posted on her social media, Julia poses clothed. Before her eyes now are the same photos. Only this time, Julia is completely naked.
Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.
Is this different to how people would edit Britney Spears’ face onto porn stars in the 90s?
It’s much easier to do now. You should be able to do several in a single minute and the barrier to entry of using the software is way lower than Photoshop. Legally though, these seem indistinguishable.
They’re easier to create and more realistic. The prevalence and magnitude of an immoral act affect how it should be legislated. Personally I don’t care if people make these and keep them to themselves, but as soon as you spread them, that’s immoral and it’s harassment, and there should be laws to prevent it.
Probably should have sued those people too… People need to cut this shit out. You’re fucking with other people’s lives.
They’re not going to. There is an insane amount of entitlement around people’s jerk off material. Right here on Lemmy, I’ve seen someone (who denied being a child) call pornography a “human right” and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.
Fuck you people who equate pornography with child porn. You know what you’re doing, you sick bastards.
Pornography is not at all the same thing as child porn. Do not speak about them in the same way.
I didn’t, but don’t let that stop you throwing a tantrum and proving my point.
That sounds like an insane amount of entitlement from the one guy you found. Hopefully that entitles you to ignore everyone with even a fraction more nuance.
How dare I ignore the many subtle layers of nuance in “Using AI to create pornographic images of a woman and then sending them to her so she knows you’ve done it”.
and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.
“Be able to” in what sense? Morally and ethically? No, absolutely not obviously. But what would the legal reason be to make it illegal since no actual children were involved? If I paint an explicit painting of a child being raped, is that illegal? I don’t think it would be. It would certainly give people good reason to be suspicious of me, but would it be illegal? And would an AI-generated image really be different?
But what would the legal reason be to make it illegal since no actual children were involved
Prove it. Trawl through thousands and thousands of images and videos of child sexual assault and tell me which ones were AI generated and which were not. Prove the AI hadn’t been set up to produce CSAM matching a real child’s likeness. Prove it won’t normalize and promote the sexual assault of real children. Prove it wasn’t trained on images and videos of real children being raped.
Legalising AI-generated child pornography is functionally identical to legalising all child pornography.
Legalizing or already legal? Because that’s my question. I don’t think it would be illegal, at least not in the U.S. I can’t speak for other countries, but here, proving a negative in court isn’t a thing.
I think porn generation (image, audio and video) will eventually be very realistic and very easy to make with only a few clicks and some well-crafted prompts. It would be a whole other level from what Photoshop used to be.
This is going to be a serious issue in the future: either society changes and these things become accepted, or these kinds of generative AI models have to be banned. But even that wouldn’t be a real safeguard against it…
I also think we have to come up with digital watermarks that are easy to use…
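For what an “easy to use” watermark could even mean: one classic (and deliberately naive) approach is hiding a provenance tag in the pixels themselves. A minimal Python sketch of least-significant-bit (LSB) embedding follows — the scheme and function names are purely illustrative, not any real standard (real provenance systems like C2PA work very differently and are far more robust):

```python
# Minimal sketch of an invisible watermark via least-significant-bit (LSB)
# embedding. Purely illustrative: naive LSB marks are trivially stripped by
# re-encoding the image, which is part of why this problem is hard.

def embed(pixels, message):
    """Hide `message` bytes in the LSBs of `pixels` (a list of 0-255 ints)."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract(pixels, length):
    """Read `length` bytes back out of the pixel LSBs."""
    data = bytearray()
    for byte_index in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[byte_index * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

image = [128] * 64            # stand-in for 64 grayscale pixel values
marked = embed(image, b"AI")  # each pixel changes by at most 1
print(extract(marked, 2))     # prints b'AI'
```

The mark is invisible (each pixel shifts by at most one intensity level) but survives only as long as the exact pixel values do — which is exactly why watermarking alone is no “security” against deepfakes.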
I think there’s a big difference between creating them and spreading them, and punishing the spreading of nudes against someone’s will, real or fake, is a better third option. The free-speech implications of banning software capable of creating them are too broad and fuzzy, but harsh penalties for spreading them, on the grounds of harassment, would be clear-cut and effective. I don’t see a big difference between spreading revenge porn and spreading deepfakes, and we already have laws against spreading revenge porn.
With ai and digital art… What is real? What is a person? What is a cartoon or a similar but not same likeness? In some cases what even is nudity? How old is an ai image? How can anything then be legal or illegal?
IF CHEWBACCA LIVES ON ENDOR, YOU MUST ACQUIT
We gotta ban photo editing software too. Shit, we gotta ban computers entirely. Shit, now we have to ban electricity.
We should ban drawing too
Pencils are the tool of Satan!!!
deleted by creator
Where did it say anything about a Ministry of Truth deciding what can be posted online? Making it illegal and having a 3rd party decide if every post is allowed are two very different things
If it’s illegal then there are ramifications for the platform, the user posting it, and the tool that created it.
Content moderation is already a thing so it’s nothing new. Just one more thing on the list to check for when a post is reported
Making it illegal and having a 3rd party decide if every post is allowed are two very different things
Depends on the scale. If you’re a black man in the South in 1953, having a 3rd party decide whether you can do something means you can’t do that thing.
I’m not speaking to this particular topic, just saying in general 3rd parties can be corrupted. It’s not a foolproof solution or even always a good idea.
I agree. It’s a terrible idea for many reasons. The fact that we can’t trust something like that to run in good faith is among the top of those reasons.
The comment I was responding to was saying this proposed law would strip our ability to speak our mind because it would create a new 3rd party group that would validate each post before allowing them online.
I was pointing out that making specific content illegal is not the same as having every post scrutinized before it goes live.
deleted by creator
Well, you’re about 20 years too late. It has already started
See any of the Tor sites for examples of what is currently filtered out of the regular internet. It even gets your Google account permanently banned if you log in via the Tor Browser.
Yeah, sorry - I disagree on every level with your take.
I am also convinced that at least the LLMs will soon destroy themselves, due to the simple fact that “garbage in, garbage out”.
deleted by creator
It’s not a serious issue at all.
Of course, if you’re the kind of greedy/lazy person who wants to make money off of pictures of their body, you’re going to have to find a real job.
I seriously don’t get why society cares if there are photos of anyone’s private parts.
I imagine those people are humiliated.
They are humiliated only because society has fed them the idea that what they’ve done (in this case not done but happened to them) is wrong. Internalizing shame meted out by society is the real psychological problem we need to fix.
Society does indeed play a big role, but if someone went around telling lies about you that everyone believed regardless of how much you denied it, that would take a toll on you.
Who are you tell people how they ought to feel? The desire for privacy is perfectly normal and you are the one trying to shame people for not wanting naked pictures of themselves everywhere.
That’s fair.
That’s what I meant. Why should it be shameful? If it weren’t, those photos would lose so much of their harm.
I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.
Modern surveillance capitalism has made sharing of private data normalised. These days we are very used to sharing pretty much everything about ourselves, in addition to having no control over how that information is used. That is a bad thing.
I suspect that these tools will, similarly, make nudity a bit more normalised in many societies across the world. That is probably a good thing overall.
What you mean to say is that non consensual nude pictures of women will be normalised and you’re ok with that. Sexual assault and domestic violence are also pretty common, you want to normalise those too?
In my opinion, an eventual loss of prudishness is just a silver lining on the cloud of surveillance capitalism.
This is exactly the same as slut shaming just in the opposite direction. Prude is a word designed to shame women who refuse to be sex objects.
Hilarious when you reactionary types try to pretend that giving women greater agency is somehow repressing women.
Forcing women to accept fake porn of themselves does not give women agency. It does the opposite.
Keep your screeching to church, please. We don’t need to hear it.
Yes, no and no.
Someone at TikTok has the power to make nudes of everyone on the planet except five homeless guys from LA you wouldn’t want a nude from anyway. TikTok has the images of you (you idiot) and the hardware and software required to fake you to everyone you know.
Welcome to China 2.0!
Pal, what the fuck are you talking about? TikTok and China are not mentioned anywhere in this article and nowhere on TikTok is there an option to generate anyone’s likeness, clothed or unclothed.