Deepfake Porn Is Out of Control

New research shows the number of deepfake videos is skyrocketing, and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.

  • CatZoomies@lemmy.world · 1 year ago · +55/−7 · edited

    This is a sad article to read. I’m not a woman, nor am I a young adult growing up with all this technology that can be leveraged against me. Could you imagine being a junior high or high school student and having an anonymous classmate create deepfake porn of you from your yearbook photo? And the children in your class gossiping about you, sharing your porn video/photo online with their friends, and you enduring that harassment? The damage that excessive pornography causes to psychological development is already well documented; now imagine the consumer of this content being around the victim. That harassment can get so much worse.

    I can’t even begin to fathom what kind of psychological damage this will cause to the youth. I feel for women everywhere - this is a terrible thing people are doing with this technology. I can’t imagine raising a daughter in this environment and trying to help her navigate this problem when some asshole creates deepfake porn of her. My niece is currently getting bullied in school - what if her bullies use these tools against her? This just makes my blood boil.

    It’s bad enough that since social media rose and captured the attention spans of kids and teenagers, there has been a well-documented correlation with an increase in youth suicide rates since 2009. https://www.health.com/youth-suicide-rate-increase-cdc-report-7551663 . Now there is a nonconsensual AI-generated porn era to navigate as well.

    These are dangerous times. This opens people up to attack, and regulation that increases the friction of accessing these tools is one of the most important next steps. Granted, outright bans never work (the persistent will always get their hands on the tools), but we need to put controls in place to limit access. Then we can remediate the root causes of these problems (e.g., proper systemic education, a modified sexual education curriculum in schools that addresses things like consent, etc.).

    EDIT:

    Wanted to add, after I posted this, that a prevalent argument I hear parroted by people is this:

    • People are gonna do this AI generation anyway. It’ll get to the point that you won’t be able to tell what’s real or not, so women can just deny it. You can’t prove it’s real anyway, so why bother?

    This is another way of saying “boys will be boys” and ignoring the problem. The problem is harassment and violence against women.

  • alienanimals@lemmy.world · 1 year ago · +20/−1

    AI and deepfakes aren’t going to stop. Schools need to get with the times rather than pretending like it’s the year 1960.

    Teachers should be able to deliver meaningful punishments to students. If someone gets caught passing these around, then that person should catch some flak. And none of this punishing both the victim and the bully, as most schools do.

  • Heratiki@lemmy.ml · 1 year ago · +7/−2 · edited

    How would you police this without the policing itself being abused?

    It’s pretty easy to spot deepfakes, even now. The type of porn being created with deepfakes is just too unbelievable when it comes to the actors and actresses. Nobody is deepfaking intimate, loving porn; it’s nearly always straight-up hardcore pornography. A lot of the evils described here feel like straw-man arguments. Anyone who believes hardcore pornography reflects what happens in reality is a moron; the amount of bullshit incest porn on these same websites is just bonkers. That being said, I can see how it can affect some people.

    But guess what? Humans tend to look similar, so how do you stop it when you don’t know if it’s real or fake? How crazy easy will it be to create yet another advantage for those with power or financial success? Examples:

    A politician is seeing a prostitute and abusing his status to do so. The prostitute secretly records a sex tape of him raping her and threatening to have her arrested if she doesn’t submit to what he wants. The video goes public; the politician claims it’s a deepfake, and the prostitute is arrested anyway. Or the reverse: a prostitute deepfakes the video and threatens the politician. News had just come out of the politician merely glancing at a woman other than his wife, so the populace sides with the prostitute and the politician is arrested.

    Or how about a woman who looks just like Taylor Swift decides she wants to work in pornography. Her likeness is immediately noticed and becomes part of her popularity, though she isn’t billed as such. T Swizzle claims it’s a deepfake made to disparage her, and the porn actress is ruined, if not sued into oblivion.

    So many scenarios could go either way. You can’t ban the technology, because you’ll never legitimately be able to know which is which. And, just like with cryptography, banning it will not keep it out of the hands of those who would use it unlawfully.

    So what’s the solution? Get over the lunacy of the whole thing? What options do we really have? Given that we have few, if any, options, all we’re doing is sending clicks to news sites that have nothing else to write about. I’m not saying it’s not a problem; I’m just not seeing a solution and don’t see a need to keep beating a dead horse.

  • trackcharlie@lemmynsfw.com · 1 year ago · +1

    Society’s views on sexuality will change before we EVER get a serious handle on deepfakes. If you’re rich and can afford the lawyers, go ham and sue; otherwise, it’s time to accept that humans are animals, and animals fuck.

    Whether or not someone is or is not in a porn video is less important than whether or not they can do whatever job or task they’ve been given.

    Religious puritanical morons and prudes need to stfu and get over it. The victims need to cope with the reality that this is never going away; they can spend their entire lives and fortunes on “finding the one who did this” or just move on and put their energy into something worthwhile.

    Even complaining about this is hysterically moronic. The ‘big threat’ is fake porn.

    Fixing the child-care system so that child abuse (emotional, physical, and sexual) is reduced by even 1% would be an immensely more worthwhile task than any pursuit against a technology that is open source and widely available. Not to mention that even if it were made illegal in your country, good luck actually enforcing a law like that without going 110% dystopian, with a locked-down internet that would make current Chinese life look like a kind big-brother system.

    • echo64@lemmy.world · 1 year ago · +16

      So, I think it’s good to identify why nonconsensual pornography is considered a bad thing, and why that is different from what you are saying here.

      Deep fake pornography for the people targeted (which is not just famous people) is incredibly invasive. Your image is out there doing things that you would be horrified to have on camera. It can destroy people’s health and cause huge problems, especially for people who are being harassed by others.

      It’s not pearl-clutching. It’s a damaging technology that benefits no one.

      • fubo@lemmy.world · 1 year ago · +2/−1

        By default, creating and publishing “deepfake porn” of a real person constitutes defamation against that person, as it carries the false statement “this person posed for this picture” which is likely to cause that person harm. Often, the intention is to cause harm.

        As such, we don’t need new laws here. Existing laws against defamation just need to be applied.

        • sfgifz@lemmy.world · 1 year ago · +2/−1

          Let’s see you put your money where your mouth is after some deepfakes of you with a turd-eating fetish get sent to your friends.