Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.
“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It seemed like something anecdotal that happened in other people’s lives, not something that would ever happen in mine”, thought Julia, a 21-year-old Belgian marketing student and semi-professional model.
At the end of September 2023, she received an email from an anonymous sender. Subject line: “Realistic?” “We wonder which photo would best resemble you”, she reads.
Attached were five photos of her.
In the original photos, posted on her social media, Julia poses fully clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.
Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.
I seriously don’t get why society cares if there are photos of anyone’s private parts.
I imagine those people are humiliated.
They are humiliated only because society has fed them the idea that what they’ve done (in this case, not something they did, but something done to them) is wrong. Internalizing shame meted out by society is the real psychological problem we need to fix.
Society does indeed play a big role, but if someone went around telling lies about you that everyone believed regardless of how much you denied it, that would take a toll on you.
Who are you to tell people how they ought to feel? The desire for privacy is perfectly normal, and you are the one trying to shame people for not wanting naked pictures of themselves everywhere.
That’s fair.
That’s what I meant. Why should it be shameful? If it weren’t, those photos would lose so much of their harm.
I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.
Modern surveillance capitalism has normalised the sharing of private data. These days we are used to sharing pretty much everything about ourselves, while having no control over how that information is used. That is a bad thing.
I suspect that these tools will, similarly, make nudity a bit more normalised in many societies across the world. That is probably a good thing overall.
What you mean to say is that non-consensual nude pictures of women will be normalised, and you’re OK with that. Sexual assault and domestic violence are also pretty common; do you want to normalise those too?
In my opinion, an eventual loss of prudishness is just a silver lining on the cloud of surveillance capitalism.
This is exactly the same as slut shaming just in the opposite direction. Prude is a word designed to shame women who refuse to be sex objects.
Hilarious when you reactionary types try to pretend that giving women greater agency is somehow repressing women.
Forcing women to accept fake porn of themselves does not give women agency. It does the opposite.
Keep your screeching to church, please. We don’t need to hear it.
I’m an atheist. If you don’t want to hear differing opinions, then what are you doing on a message board?
Keep jerkin to nonconsensual porn.
Yes, no and no.