jeffw@lemmy.world (mod) to News@lemmy.world · 1 year ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
71 comments

over_clox@lemmy.world · 1 year ago
Then we should be able to charge AI (or rather its developers) with the same disgusting crime, and shut AI down.

jeffw@lemmy.world (OP, mod) · 1 year ago
I think that’s a bit of a stretch. If it were being marketed as “make your fantasy, no matter how illegal it is,” then yeah. But just because I use a tool someone else made doesn’t mean they should be held liable.

over_clox@lemmy.world · 1 year ago
Check my other comments. I was comparing it to a hammer. Hammers aren’t trained to act or respond on their own from millions of user inputs.

jeffw@lemmy.world (OP, mod) · 1 year ago
I learned how to write by reading. The AI did the same, more or less, no?

over_clox@lemmy.world · 1 year ago
The AI didn’t learn to draw or generate photos from words alone, though…

jeffw@lemmy.world (OP, mod) · 1 year ago
Oh, it learned from art? Like how human artists learn?

over_clox@lemmy.world · 1 year ago
AI hasn’t exactly kicked out a Picasso of a naked young girl missing an ear yet, has it? I sure hope not! But if it can, that seriously indicates there must be some bad training data in the system… I won’t be testing that hypothesis.

xmunk@sh.itjust.works · 1 year ago
It in fact does have bad training data! https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse

over_clox@lemmy.world · 1 year ago
Thank you for posting a relevant link. It’s disappointing that such data is part of any public AI system… ☹️

[author not captured] · 1 year ago
Can we do guns next?

over_clox@lemmy.world · 1 year ago
I’d rather not fart bullets, but thank you for inviting me to the party.

[deleted by creator]

over_clox@lemmy.world · 1 year ago
Setting inappropriate usage aside entirely, AI can be funny and entertaining, but on the flip side it’s also taking people’s jobs. It shouldn’t take a book, let alone three seconds of common-sense thought, to realize that.