The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes

Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”
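
The article doesn’t say which scheme is “in the works.” As a rough sketch of what “cryptographically verify” typically means, here is a hedged example of signing a released file with an Ed25519 key and checking that signature on the other end, using Python’s `cryptography` package. The key handling, filenames, and workflow are assumptions for illustration, not the White House’s announced method.

```python
# Minimal sketch of signing and verifying a release file with Ed25519.
# Assumes the publisher would release the public key and a signature
# alongside each video -- purely illustrative, not the announced scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Signer side (e.g., the press office): generate a key pair once,
# publish the public key, and sign each released video.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

video_bytes = open("official_release.mp4", "rb").read()  # hypothetical file
signature = private_key.sign(video_bytes)

# Verifier side (e.g., a newsroom or video platform): check the signature
# against the published public key. Any altered byte fails verification.
try:
    public_key.verify(signature, video_bytes)
    print("Signature valid: bytes match what was signed.")
except InvalidSignature:
    print("Signature invalid: file was altered or isn't the signed release.")
```

Note that a signature like this only proves the exact file is untouched; any re-encode, even a legitimate one, breaks it, which is the gap the perceptual-hashing comment further down is pointing at.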

  • circuitfarmer@lemmy.world · 10 months ago

    I’m sure they do. AI regulation probably would have helped with that. I feel like Congress was busy with shit that doesn’t affect anything.

    • ours@lemmy.world · 10 months ago

      I salute whoever has the challenge of explaining basic cryptography principles to Congress.

        • wizardbeard@lemmy.dbzer0.com · 10 months ago

          That’s why I feel like this idea is useless, even for the general population. Even with some sort of visual/audio-based hashing, so that the hash is independent of minor changes like video resolution that don’t change the content, and with major video sites implementing a way to verify that the hash matches one from a trustworthy keyserver equivalent (roughly the idea sketched after this comment)…

          The end result for anyone not downloading the videos and verifying them themselves is the equivalent of those old “✅ safe ecommerce site, we swear” images. Any dedicated misinformation campaign will just fake it, and that will be enough for the people who would have believed the fake to begin with.
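
A rough sketch of the “visual hashing independent of resolution” idea from the comment above: an average hash of a video frame, compared by Hamming distance, survives resizing and re-encoding that an exact signature would not. The hash size, distance threshold, and filenames below are illustrative assumptions, not anything a video site is known to implement.

```python
# Sketch of a perceptual (average) hash that tolerates resolution changes,
# unlike the exact byte-level signature above. Filenames and the distance
# threshold are illustrative assumptions.
from PIL import Image

def average_hash(image_path: str, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, threshold on the mean."""
    img = Image.open(image_path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Frames extracted from the official release and from a re-encoded upload
# should land within a few bits of each other; heavy edits usually won't.
official = average_hash("frame_official.png")   # hypothetical file
reupload = average_hash("frame_reupload.png")   # hypothetical file
print("likely same content" if hamming_distance(official, reupload) <= 5
      else "content differs")
```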

    • lemmyingly@lemm.ee · 10 months ago

      I see no difference between creating a fake video/image with AI and creating one with Adobe’s packages. So to me this isn’t an AI problem; it’s a problem that should have been resolved a couple of decades ago.