A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • foggy@lemmy.world · 1 year ago

    Methinks this problem is gonna get out of fucking hand. Welcome to the future, it sucks.

    • Goldmage263@sh.itjust.works · 1 year ago

      AI is out of the bag for all the good and bad it will do. Nothing will be safe on the internet, and hasn’t been for a long time now. Either we will get government monitored AI results or use AI to combat misuse of AI. Either way isn’t preventative. The next wild west frontier is upon us, and it’s full of bandits in hiding.

  • Marxism-Fennekinism@lemmy.ml · 1 year ago

    Maybe I’m just naive about how many protections we’re actually granted, but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

  • Aceticon@lemmy.world · 1 year ago

    There might be an upside to all this, though maybe not for these girls: with enough of this, people will eventually stop believing that any “leaked” nude pictures are real. That will be a great thing for people who had real nude pictures leaked (which, once on the Internet, are pretty hard to stop spreading), because other people will just presume they’re deepfakes.

    Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others because they’ve been photographed in their birthday-suit, but that’s clearly asking too much so I guess simply people assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

  • ZombiFrancis@sh.itjust.works · 1 year ago

    In previous generations the kid making fake porn of their classmates was not a well liked kid. Is that reversed now? On the basis of quality of tech?

    • Omega@lemmy.world · 1 year ago

      That kid that doodles is creepy. But deep fakes probably feel a lot closer to actual nudes.

  • calypsopub@lemmy.world · 1 year ago

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • WoahWoah@lemmy.world · 1 year ago

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      This isn’t like high school when you went to high school.

      Agreed on your last paragraph.

      • Margot Robbie@lemmy.world · 1 year ago

        Then nude leak scandals will quickly become a thing of the past, because now every nude video or picture can be assumed to be AI-generated, and therefore fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should fall on those who willingly host this deepfake content rather than on AI image generators.

        • finestnothing@lemmy.world · 1 year ago

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends, their jobs, and have their life ruined even if they prove that they are completely innocent

          Plus, something I’ve already seen happen is someone says a nude is fake and are then told they have to prove that it’s fake to get people to believe them… which is very hard without sharing an actual nude that has something unique about their body

        • toonicycle@lemmy.world · 1 year ago

          I mean they obviously shouldn’t have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you’re a young woman. Also in these cases where they aren’t adults it would be considered cp.

      • calypsopub@lemmy.world · 1 year ago

        I wasn’t very representative even when I WAS a teenager. I was bullied quite a bit, though.

        • atzanteol@sh.itjust.works · 1 year ago

          And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it’s pretty unrealistic to expect everyone to be able to do so. And it’s not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

  • TheEighthDoctor@lemmy.world · 1 year ago

    What’s the fundamental difference between a deep fake and a good Photoshop and why do we need more laws to regulate that?

  • Gork@lemm.ee · 1 year ago

    President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual “intimate imagery of real individuals.” The order also directs the federal government to issue guidance to label and watermark AI-generated content to help differentiate between authentic and material made by software.

    Step in the right direction, I guess.

    How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it’s difficult for human eyes to tell the difference.

    • apex32@lemmy.world · 1 year ago

      That’s a cool quiz, and it’s from 2022. I’m sure AI has improved since then. Would love to see an updated version.

    • CommanderCloon@lemmy.ml · 1 year ago

      I wouldn’t call this a step in the right direction. A call for a step, yeah, but it’s not actually a step until something is actually done.

  • NightAuthor@lemmy.world · 1 year ago

    I wonder what the prevalence of this kind of behavior is like in countries that aren’t so weird about sex.

    • atzanteol@sh.itjust.works · 1 year ago

      This has nothing to do with “being weird about sex” and everything to do with men treating women poorly.

      You can expect this to be worse in nations where women don’t have as many rights and/or where misogyny is accepted as part of life.

      • NightAuthor@lemmy.world · 1 year ago

        Sounds plausible. We just abolished Roe, so… it’s not looking great for the future of this issue in the US.

  • Treczoks@lemm.ee · 1 year ago

    The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

      • Treczoks@lemm.ee · 1 year ago

        Isn’t it already? Has it provided any sort of protection? Many things in this world are illegal, and nobody cares.

        • Jimmyeatsausage@lemmy.world · 1 year ago

          Yes, I would argue that if CSAM was legal, there would be more of it…meaning it being illegal provides a level of protection.

          • yamanii@lemmy.world · 1 year ago

            I wonder why you’re being downvoted; something being illegal puts fear in most people not to do it.

            • 31337@sh.itjust.works · 1 year ago

              I’ve been wondering about this lately, but I’m not sure how much of an effect this has. There are millions of people in prison, and many of those will go on to offend again. Making things illegal can be seen as an agreement to a social contract (in a democracy), drive the activity underground (probably good thing in many cases), and prevent businesses (legal entities) from engaging in the activity; but I’m not sure how well it works on an individual level of deterrence. Like, if there were no laws, I can not really think of a law I would break that I wouldn’t already break regardless. I guess I’d just be more open about it.

              Though, people who cause harm to others should be removed from society, and ideally, quickly rehabilitated, and released back into society as a productive member.

      • CAVOK@lemmy.world · 1 year ago

        It is where I’m at. Draw Lisa Simpson nude and you get a visit from the law. Dunno what the punishment is though. A fine? Jail? Can’t say.

        Edit: Apparently I was wrong, it has to be a realistic drawing. See here: 2010/0064/COD doc.nr 10335/1/10 REV 1

    • afraid_of_zombies@lemmy.world · 1 year ago

      Require consent to take a person’s picture and hold them liable for whatever comes from them putting it on a computer.

      • jimbo@lemmy.world · 1 year ago

        That’s a whole fucking can of worms we don’t need to open. Just make faking porn a crime similar to publishing revenge porn.

        • afraid_of_zombies@lemmy.world · 1 year ago

          Nah. Use my image and pay me what I want. If I can’t make a Mickey Mouse movie, they shouldn’t be able to make a porn starring me. Does a corporation have more rights to an image than I have to my own image?

          • jimbo@lemmy.world · 1 year ago

            That really depends on what you consider “using my image”. Are you going to demand that people pay you because you were wandering around in the background of their family photo or their YouTube video? Will you ask to be compensated when people post group photos that include you on their social media? Does mom owe you money for all those pictures she took of you as a kid?

      • Treczoks@lemm.ee · 1 year ago

        You already need consent to take a person’s picture. Did it help in this case? I don’t think so.

          • Treczoks@lemm.ee · 1 year ago

            Sorry, I forgot that the US is decades behind the rest of the world in privacy laws.

            Well, maybe you could start with this aspect.

        • afraid_of_zombies@lemmy.world · 1 year ago

          Really? Please show me the signed and notarized letter with the girl’s name on it that says she agrees to have her image used for AI porn. Also, since she is a minor, one from her legal guardians.

          • CommanderCloon@lemmy.ml · 1 year ago

            How would you possibly enforce that, or prevent people from just copying publicly available pictures for nefarious usage

            • afraid_of_zombies@lemmy.world · 1 year ago

              It would have to be enforced after getting caught, as an add-on charge. Like if an area has a rule against picking locks to commit a crime: you can never be charged with it alone, but it can be added on to existing charges.

  • renrenPDX@lemmy.world · 1 year ago

    This is treading on some dangerous waters. Kids need to realize this is way too close to basically creating underage pornography/trafficking.

  • virock@lemmy.world · 1 year ago

    I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

    • rustydomino@lemmy.world · 1 year ago

      hmmm - I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data…

  • interceder270@lemmy.world · 1 year ago

    I think the best way to combat this is to ostracize anyone who participates in it.

    Let it be a litmus test to see who is and is not worth hanging out with.

    • MotoAsh@lemmy.world · 1 year ago

      The problem with that plan is there are too many horrible people in the world. They’ll just group up and keep going. Horrible people don’t stop over mere inconvenience.

      • interceder270@lemmy.world · 1 year ago

        Yeah. Those horrible people can have a shitty life surrounded by other horrible people.

        Let them be horrible together and we can focus on the people who matter.

        • yamanii@lemmy.world · 1 year ago

          Just like Nazis won’t go away just because you ignore them, it’s the same thing here.

    • Flying Squid@lemmy.world · 1 year ago

      These deepfakes don’t disappear. You can ostracize all you like, but that won’t stop these from potentially haunting girls for the rest of their lives.

      I don’t know what the solution is, honestly.

      • calypsopub@lemmy.world · 1 year ago

        Why should it haunt them? Even if the images were REAL, why should it haunt them? I’m so tired of the puritanical shame women are supposed to feel about their bodies. We all have the same basic equipment. If a guy makes a deep fake, it is HE who should feel shame and humiliation for being a sick pervert. Girls need to be taught this. Band together and laugh these idiots off campus. Name and shame online. Make sure HE will be the one haunted forever.

        • Flying Squid@lemmy.world · 1 year ago

          I don’t mean psychologically haunt them, I mean follow them for the rest of their lives affecting things like jobs and relationships. It doesn’t matter whether or not they’re fake if people don’t think they’re fake.

          Naming and shaming who did this to them will not stop them from being fired from their schoolteaching job in 15 years when the school discovers those images. Do you think “those were fake” is going to be enough for the school corporation if it’s in, for example, Arkansas?

      • crashoverride@lemmy.world · 1 year ago

        The solution is for no one to care or make a big deal out of it; they’re not real, so you shouldn’t care.

        • Chakravanti@sh.itjust.works · 1 year ago

          According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

          • Olgratin_Magmatoe@startrek.website · 1 year ago

            People have been trying to circumvent ChatGPT’s filters, and they’ll do the exact same with open source AI. But it’ll be worse because it’s open source, so any built-in feature to prevent abuse could just get removed and then recompiled by whoever.

            And that’s all even assuming there ever ends up being open source AI.

            • Chakravanti@sh.itjust.works · 1 year ago

              Your logic is bass ackwards. Knowing the source openly means the shit gets fixed faster. Closed source just doesn’t get fixed 99% of the time, because there’s only one motherfucker to do the fixing, and usually he just doesn’t do it.

              • Olgratin_Magmatoe@startrek.website · 1 year ago

                You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

                I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

                • Chakravanti@sh.itjust.works · 1 year ago

                  Forks are productive. You’re just wrong about it. I’ll take FOSS over closed source. I’ll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.