This is half-a-decade-old news, but I only found out about it myself when it came up by accident in conversation at the DMV; the worker would not have mentioned it otherwise. Every DMV photo in the United States is being used for AI facial recognition, and nobody has talked about it for years. This is especially concerning given that citizens are recently being required to update their ID to a “Real ID,” which means more people than ever before are giving away the rights to their own face.

The biggest problem with privacy issues is that people talk about them for a while, but more often than not nothing is ever done to fix the problem; it simply gets forgotten. For example, in the next few years Copilot will simply become a part of people’s lives, and people will slowly stop talking about its privacy implications. What can we even do to fight the privacy practices of these giants?

  • rc_buggy@sh.itjust.works · 7 months ago

    “nobody has talked about it for years”

    March 2024: https://www.9news.com/article/news/investigations/police-use-colorado-dmv-facial-recognition-program/73-0c2d862c-a33c-4598-bc0a-e0d1a8ee287f

    There are rumblings here in CO about curtailing law enforcement’s use of this database. I would personally like subpoena protection for the database, so it at least has to go before a judge before the cops can rifle through everyone’s pictures.

    “This is especially concerning given that citizens are recently being required to update their ID to a ‘Real ID,’ which means more people than ever before are giving away the rights to their own face.”

    lol, no. Real ID is just a set of requirements the federal govt. has implemented to make sure state IDs are held to the same standard as passports re: data integrity and the information presented. That’s all. It’s not the mark of the beast, and not something nefarious that tracks you any more than your old state-issued DL did in the past. As for “recently”? Nope, states have had 20 years.

  • davel [he/him]@lemmy.ml · 7 months ago

    “This is especially concerning given that citizens […]”

    Not everyone with a US driver’s license is a US citizen.

    • The 8232 Project@lemmy.ml (OP) · 7 months ago

      Correct, however this issue primarily affects US citizens, given that driver’s licenses aren’t the only IDs the DMV takes pictures for (e.g. the aforementioned Real ID).

      • edric@lemm.ee · 7 months ago

        Non-citizens also use driver’s licenses and state IDs with Real ID. It’s standard regardless of citizenship.

  • tarix29@lemmy.world · 7 months ago

    They also contract a company that trains facial recognition AI on social media. I think the Real ID is the least of most people’s worries.

  • weariedfae@lemmy.world · 7 months ago

    Joke’s on them: they made me take off the glasses that hide aspects of my wonky face, but I wear glasses 100% of the time out in the world. You’ll never catch me, copper!

    (I know. I just like to think it can’t figure out crooked nose vs. glasses.)

  • krolden@lemmy.ml · 7 months ago

    I see no issue with the government using photo ID pictures this way, as long as they aren’t using third parties (e.g. private corps) to handle the technical side or letting any of the data pass through them. They would be stupid to ignore that large amount of known-good data they could train their facial recognition models on. Yes, it sounds big and evil, but that’s the world we live in as long as this technology exists and you want to participate in society, I guess.

    They’re collecting the data already, it’s being used this way already by everyone else, so why not?

    • The 8232 Project@lemmy.ml (OP) · 7 months ago (edited)

      Many people’s threat models, like my own, are about resisting mass surveillance. This falls under that category, even if it’s being handled responsibly. The issue is that people have no way to opt out, and there is a lack of transparency about how facial recognition is being used.

      • krolden@lemmy.ml · 7 months ago (edited)

        You can be against it all you want, but that doesn’t mean it’s going to matter IRL. The state of the world is that anyone with a large amount of data like this is using it to build models so they can profit and/or enforce. Even if they say they’re not doing it, they’re still doing it. Or someone with access to that data is doing it.

        Crying about the feds/DMV doing facial rec training is low-hanging fruit. Obviously they’re going to do it, along with every other government on the planet that has the resources to. TBH there’s nothing inherently malicious about it, since their having that data comes with you holding citizenship/identification in that country. The really malicious ones are the corporations contracted by said government to do the exact same thing, except they’re doing their own data collection through huge networks of privately owned security cameras.

        The only way to avoid this is to go live in the woods and never come out. Any show of transparency or opting out of any of this would just be theater for you. It’s being done, has been done, and will be done without your consent or knowledge.

        • The 8232 Project@lemmy.ml (OP) · 7 months ago

          Just because mass surveillance is already happening doesn’t mean we should accept it as our only option. While it’s true that governments and corporations are collecting data on us, there is still merit in pushing back against these practices. The point of privacy is not to hide everything and live in the woods; the point of privacy is to have control over what data you share, when you share it, and with whom you share it. The problem isn’t facial recognition itself; the problem is that living in the woods shouldn’t be the only way to avoid it. We should be able to opt out. What seems fine to you is not always fine for others. That’s why threat models exist, after all.