“Like so many applications of AI, this new power is likely to be a double-edged sword: It may help people identify the locations of old snapshots from relatives, or allow field biologists to conduct rapid surveys of entire regions for invasive plant species, to name but a few of many likely beneficial applications.

“But it also could be used to expose information about individuals that they never intended to share,” says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology. Stanley worries that similar technology, which he feels will almost certainly become widely available, could be used for government surveillance, corporate tracking or even stalking.

    • PoopMonster@lemmy.world
      1 year ago

      OpenStreetMap lets you write some insanely precise queries. There’s a company whose plan was to team up with governments to pinpoint mass shooters while they were streaming (as a use case).

      So say it’s clear in the video that they’re in X city, and you can see things like a McDonald’s, a Starbucks, fenced-in playgrounds, churches, what have you. You can give the query a bounding box with all that info and very quickly narrow down where the video could have been taken.
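      The bounding-box narrowing described above can be sketched as an Overpass QL query (the query language used against OpenStreetMap data), built here in Python. The coordinates, radius, and tag filters are made up for illustration; a real search would layer in more landmarks to shrink the candidate set further.

```python
# Sketch: an Overpass QL query that finds Starbucks locations within a
# short walk of a McDonald's, inside a given bounding box. Every extra
# landmark filter like this cuts the list of candidate locations down.
# The coordinates below are invented for illustration.

def overpass_query(south, west, north, east, radius_m=300):
    """Build an Overpass QL string for the landmark combination above."""
    bbox = f"{south},{west},{north},{east}"
    return f"""
[out:json][timeout:60];
node["brand"="McDonald's"]({bbox})->.mcd;
node["brand"="Starbucks"](around.mcd:{radius_m});
out center;
""".strip()

query = overpass_query(39.7, -105.05, 39.8, -104.9)
print(query)
```

      In practice you would POST a string like this to a public Overpass API endpoint and get back coordinates for each match; the point is just how quickly combining a few visible landmarks collapses an entire city into a handful of spots.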

      I think there were also some people who would pinpoint images from mountain outlines as a game. Kind of like GeoGuessr on steroids.

  • skydivekingair@lemmy.world
    1 year ago

    This isn’t unique to AI; like most LLM-based tools, it’s just accomplishing the same thing faster and on a larger scale. Personally, I think if you want privacy you should limit the personal things you post to what you’re okay with being out there, and form habits such as waiting until you’re home from vacation to post pictures.

  • afraid_of_zombies@lemmy.world
    1 year ago

    Yes, and people like me have continued to point out that this problem stems from a bad view of the expectation of privacy.

    A non-famous person has a reasonable expectation of privacy on public property. If you take a photo and a non-famous person’s face is in it, you should have written consent for that specific photo, or blur it out. If Disney can own an image of a mouse for 95 fucking years, I can own my own image.

    Don’t take pictures of people or their property without consent. Just because technology allows you to be a disgusting creep doesn’t mean you should. If you want jerk off material just use the internet like the rest of us.

  • AFK BRB Chocolate@lemmy.world
    1 year ago

    Getting that kind of accuracy from a student project with such a small sample set is pretty remarkable, and pretty frightening. Yes, there are people who are good at this, but (1) this AI just beat one of the most skilled humans, and (2) having it in an AI brings the capability to anyone, regardless of their motives.

    Plus, with an AI you can incorporate more heuristics than any human could reasonably master. The article mentions types of foliage, which is a good example. An AI could incorporate thousands of things like that easily. Seems like a tool that’s ripe for abuse, but I don’t know what you could do about it.