Israel has deployed a mass facial recognition program in the Gaza Strip, creating a database of Palestinians without their knowledge or consent, The New York Times reports. The program, which was created after the October 7th attacks, uses technology from Google Photos as well as a custom tool built by the Tel Aviv-based company Corsight to identify people affiliated with Hamas.

  • GrymEdm@lemmy.world · 9 months ago

    Israel is the kind of control-heavy, far-right state other dictators wish they could run, and it’s made possible by Western money and technology (I was going to name just the US, but my country of Canada, among others, is not blameless either). This news also sucks because there’s no way that tech is staying in Israel only. Citizens of the world had better brace for convictions via AI facial recognition.

    “Our computer model was able to reconstruct this image of the defendant nearly perfectly. It got the hands wrong and one eye is off-center, but otherwise that’s clearly them committing the crime.”

    • wanderingmagus@lemm.ee (OP) · 9 months ago

      From what I remember, AI facial recognition tech was already being used by police and agencies worldwide (the FBI, PRC police, etc.), or am I misinformed? I remember something about Chinese and American facial recognition software.

      • GrymEdm@lemmy.world · 9 months ago

        I hadn’t read anything like that, but a quick search pulled up this story from last September by Wired that supports your post: FBI Agents Are Using Face Recognition Without Proper Training. “Yet only 5 percent of the 200 agents with access to the technology have taken the bureau’s three-day training course on how to use it, a report from the Government Accountability Office (GAO) this month reveals.” So it sounds like you’re right, and also that they’re probably inadequately trained even if they do complete all three days, given the legal ramifications of identifying people this way.

        • wanderingmagus@lemm.ee (OP) · 9 months ago

          And I wonder how many of those 95% have already used misapplied AI facial recognition to justify FISA court warrants for ~~stalking~~ investigating ~~random people~~ suspected terrorists?

    • Sanctus@lemmy.world · 9 months ago

      Facial tattoos of DROP TABLE commands. Embed computer worms into your iris. We can get insane to fuck all this shit up too. I bet there’s a way to embed a computer virus on your own face.
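
      (For anyone who missed the reference: the premise of the “DROP TABLE tattoo” joke is a system that pastes recognized text straight into SQL. Below is a minimal sketch of that premise, assuming a hypothetical SQLite table of scan results; the table name and payload are made up for illustration.)

      ```python
      import sqlite3

      # Hypothetical table of facial-recognition matches (illustration only).
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE scans (name TEXT)")

      # A "name" read off a tattoo, carrying an injection payload.
      payload = "nobody'); DROP TABLE scans;--"

      # Naive string building: the payload runs as SQL and drops the table.
      conn.executescript(f"INSERT INTO scans (name) VALUES ('{payload}')")
      print(conn.execute(
          "SELECT count(*) FROM sqlite_master WHERE name = 'scans'"
      ).fetchone())  # (0,) -- the table is gone

      # Parameterized query: the same payload is stored as inert text instead.
      conn.execute("CREATE TABLE scans (name TEXT)")
      conn.execute("INSERT INTO scans (name) VALUES (?)", (payload,))
      print(conn.execute("SELECT name FROM scans").fetchall())
      ```

      Any system that parameterizes its queries shrugs this off, so the tattoo is more protest art than working exploit.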

      • GrymEdm@lemmy.world · 9 months ago

        I guess I’ll adjust my life goals to “hot cyberpunk partner in technological dystopia”, because that sounds like some Blade Runner/Cyberpunk 2077 stuff.

        • Sanctus@lemmy.world · 9 months ago

          It’s not that far off. We’ll see exactly what I said soon enough. You can put a virus or worm inside an image in an email. You can do the same thing with a tattoo. It’s unfortunate it will be here so long before the superhuman cybernetics.

        • wanderingmagus@lemm.ee (OP) · 9 months ago

          Honestly, with enshittification, “technological dystopia” sounds like exactly where we already are. Now, if only implants weren’t being R&D’d by Muskrat and there were some open-source, non-invasive version…

      • fruitycoder@sh.itjust.works · 9 months ago

        Attempts at adversarial AI tattoos, face masks, and clothing have been made before. They basically exploit the model’s lack of a deeper understanding of the world, so you can trick it with specific visual artifacts.
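
        For the curious, the usual starting point for these tricks is an adversarial perturbation: nudge the input pixels along the gradient of the model’s loss until its prediction flips, while the picture still looks normal to a person. Here is a minimal FGSM-style sketch in PyTorch, assuming an off-the-shelf pretrained classifier and a random stand-in image; real adversarial patches and clothing use far more elaborate, physically robust optimization.

        ```python
        import torch
        import torch.nn.functional as F
        import torchvision.models as models

        # Any pretrained classifier works for illustration.
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

        def fgsm_perturb(image, label, epsilon=0.03):
            """Fast Gradient Sign Method: one gradient step against the current label."""
            image = image.clone().requires_grad_(True)
            loss = F.cross_entropy(model(image), label)
            loss.backward()
            # Push every pixel slightly in the direction that increases the loss.
            return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

        # Random stand-in for a photo; a real attack would start from an actual image.
        x = torch.rand(1, 3, 224, 224)
        y = model(x).argmax(dim=1)                          # original prediction
        x_adv = fgsm_perturb(x, y)
        print(y.item(), model(x_adv).argmax(dim=1).item())  # often no longer matches
        ```

        The per-pixel change is tiny, which is exactly the “specific visual artifacts” part: the model reacts to patterns that carry no meaning for a human looking at the same image.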