An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”

It was the first time the AI avatar of a victim, in this case a dead man, had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.

The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”

  • Echo Dot@feddit.uk · 16 hours ago

    > Fuck everything about this, this should be prohibited

    Why? Who exactly is being harmed by this? The dead guy certainly isn’t. It’s no different than a statement from a family member. The method of delivery does not make a difference to the material content. You’re acting as if it’s putting words in the mouth of someone who’s died, but everyone intellectually knows that the AI isn’t contacting the dead.

    You would have a hard time arguing that someone could be confused into believing that this was actually the opinion of the deceased.

    • kiagam@lemmy.world · 16 hours ago

      You have a lot of faith in people’s logic level. Most people read at a 6th grade level. There is a person saying “I think this.” Do you really think everyone in there thought, “I’m completely unaware of what the deceased thought”?