In November 2021, in the city of Chandler, Arizona, Chris Pelkey was shot and killed by Gabriel Horcasitas in a road rage altercation.

Horcasitas was tried and convicted of reckless manslaughter.

When it was time for Horcasitas to be sentenced by a judge, Pelkey’s family knew they wanted to make a statement – known as a “victim impact statement” – explaining to the judge who Pelkey had been when he was alive.

They found they couldn’t get the words right.

The solution for them turned out to be having Pelkey speak for himself by creating an AI-generated avatar that used his face and voice, allowing him to “talk” directly to the judge.

[…]

  • snooggums@lemmy.world

    The article authors are equating an avatar speaking what family thinks the victim would say with the victim speaking for himself. That is not the same thing and the judge should not have allowed it.

    • Madison420@lemmy.world

      It’ll probably get tossed, but to find out whether it’s legal, someone actually has to try it, no matter how dumb it may be.