An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”

It was the first time the AI avatar of a victim—in this case, a dead man—had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.

The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”

  • Jarix@lemmy.world
    7 hours ago

    This wasn’t testimony, it was an impact statement.

Impact statements are wild and crazy, and this isn’t surprising in any way

    • Phoenixz@lemmy.ca
      2 hours ago

      No, this wasn’t an impact statement either.

This was a bunch of pixels moved around by a hugely wasteful amount of CPU power. The actual victim is dead; he can’t talk, and people are putting words in his mouth, and it shouldn’t be allowed.

      • Jarix@lemmy.world
        51 minutes ago

It literally says in the article that this was presented as the victim impact statement.

Have you learned nothing about modern “news”? Don’t be part of the problem of spreading misinformation; be diligent and responsible. And it’s okay to make mistakes, own them and move forward. It’s not easy to get your information correct every time, and there’s no shame in that, only in ignoring your responsibility to self-correct voluntarily when you find out.

Peace be upon you. We need to work together, because even though I’m calling out the inaccuracy in your comment, I do believe using this technology for this purpose is heinous.

Edit: from the NPR article, as it’s not paywalled:

But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey’s case.

        “Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited,” she told NPR via email.