Imagine this scenario: you’re worried you may have committed a crime, so you turn to a trusted advisor — OpenAI’s blockbuster ChatGPT, say — to describe what you did and get its advice.

This isn’t remotely far-fetched; lots of people are already getting legal assistance from AI, on everything from divorce proceedings to parking violations. Because people are amazingly stupid, it’s almost certain that some have already asked the bot about enormously consequential matters — murder or drug charges, say.

According to OpenAI CEO Sam Altman, anyone who’s done so has made a massive error — because unlike a human lawyer, with whom you enjoy sweeping confidentiality protections, ChatGPT conversations can be used against you in court.

  • Someonelol@lemmy.dbzer0.com · 3 days ago

    Using a search engine for an answer now could get you into that same trap then, no? Even DDG has an AI answer generated automatically.

    • some_guy@lemmy.sdf.org · 3 days ago

      We didn’t need AI for Google search queries to show up in murder cases where people searched for how to clean blood/DNA with bleach, how to dispose of bodies, and other really stupid questions. From their home computers. Pretty much right around the time that someone close to them went missing. It’s all timestamped.