AI Model of Victim ‘Forgives’ Killer in Court

Should artificial intelligence speak for the dead?

That's what KNXV-TV reported happened last month during the trial of Gabriel Horcasitas, who was convicted of manslaughter for a road rage incident in Chandler, Arizona, in which he shot and killed Chris Pelkey in 2021.

Stacey Wales – Pelkey’s sister – used AI to give a victim impact statement in court. Her husband Tim and their friend Scott Yentzer had been working in AI for years. Wales said the process was not easy.

“There’s no tool out there that you can just go and say, here’s a voice file. Here’s a picture. Please make it come to life. And this is what I wanted to say. So they’re scrounging and using this tool and that tool and this tool and this script and this audio and this image and trying to mash it all together and make a Frankenstein of love,” she said, according to KSAZ-TV.

“To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends,” the AI avatar said.

The AI told viewers in court, "I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile."

It went on to thank attendees in court and said it felt "humbled" by those who spoke on Pelkey's behalf.


It even addressed the judge, Todd Lang, thanking him for his persistence in presiding over the case and listening to every impact statement.

Lang, who clearly felt moved by the video, said in court, “I love that AI. Thank you for that. I felt like that was genuine; that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about today.”

The AI video was shown on the day of Horcasitas' sentencing because Wales felt that letters from Pelkey's friends and loved ones could not convey as intimately who her brother was.

"We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," Wales said of the decision.

KNXV-TV says there is no other recorded use of AI for a victim impact statement.


After being convicted of manslaughter, Horcasitas was sentenced to ten and a half years in prison, a year longer than the nine and a half years prosecutors had asked for.

The one glaring problem here: this is not Pelkey.

Although the video comes off as heartfelt and – by all accounts – a genuine recreation of Pelkey, it is just that, a recreation.

Pelkey had no say in this, having died the day of the shooting.

We have no idea what he would say or think after the fact. Although the AI is clear that it is not Pelkey, it proceeds to speak in the first person.

The Associated Press reports that this is a legal means of giving a victim impact statement in Arizona, as such statements are not considered evidence.

Horcasitas' lawyer, Jason Lamm, told the AP that he has appealed his client's sentence and that an appeals court will likely consider whether the AI influenced how Lang handed down that sentence.

AI now permeates every aspect of our lives, from education to art and culture. Its use in our judicial system poses serious risks if Lamm is right.

 
