Stacey Wales gripped the lectern, choking back tears as she asked the judge to give the man who shot and killed her brother the maximum possible sentence for manslaughter.
What came next surprised those in the Phoenix courtroom last week: an AI-generated video bearing the likeness of her brother, Christopher Pelkey, told the shooter he was forgiven.
The judge said he loved and appreciated the video, then sentenced the shooter to 10.5 years in prison, the maximum sentence and more than prosecutors were seeking. Hours after the May 1 hearing, the defendant’s lawyer filed a notice of appeal.
Defense attorney Jason Lamm won’t be handling the appeal, but said a higher court will likely be asked to weigh in on whether the judge improperly relied on the AI-generated video when sentencing his client.
Courts across the country have been grappling with how best to handle the growing presence of artificial intelligence in the courtroom. Even before the Pelkey family used AI to give the victim a voice for the impact statement, believed to be a first in U.S. courts, the Arizona Supreme Court created a committee that researches best practices for AI.
In Florida, a judge recently donned a virtual reality headset meant to show the point of view of a defendant who said he was acting in self-defense when he waved a loaded gun at wedding guests. The judge rejected his claim.
And in New York, a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took the judges only seconds to realize that the man addressing them from the video screen wasn’t real.
Experts say the use of AI in court raises legal and ethical concerns, especially if it is used effectively to sway a judge or jury. And they argue it could have a disproportionate impact on marginalized communities facing prosecution.
“I imagine that this will be a contested form of evidence, in part because it could be something that favors parties with more resources over those without,” said David Evan Harris, an expert on AI deepfakes who teaches at UC Berkeley’s business school.
AI can be very persuasive, Harris said, and scholars are studying the intersection of the technology and manipulation tactics.
Cynthia Godsoe, a law professor at Brooklyn Law School and a former public defender, said that as this technology continues to push the boundaries of traditional legal practice, courts will have to confront questions they have never had to weigh before: Does this AI-generated photograph really match the witness’s testimony? Does this video exaggerate the suspect’s height, weight or skin color?
“It’s definitely a disturbing trend,” she said, “because it could veer even further into fabricated evidence that people may not realize is false.”
In the Arizona case, the victim’s sister told The Associated Press that she did consider the ethics of writing a script and using her brother’s likeness to give him a voice during the sentencing hearing.
“It was important for us to address this with ethics and morals and not use it to say things that Chris would not say or believe,” said Stacey Wales.
Victims in Arizona can deliver their impact statements in any digital format, said victims’ rights attorney Jessica Gattuso, who represented the family.
When the video played in the courtroom, Wales said, only she and her husband knew about it.
“The goal was to humanize Chris and to reach the judge,” Wales said.
After watching it, Maricopa County Superior Court Judge Todd Lang said he “loved the beauty” in what Christopher said in the AI video.
“It also says something about the family,” he said. “Because you told me how angry you were, and you demanded the maximum sentence, and even though that’s what you wanted, you allowed Chris to speak from his heart as you saw it.”
The defendant’s lawyer said the judge’s comments could be a factor in getting the sentence overturned on appeal.
___
Associated Press reporters Sarah Parvini in Los Angeles, Sejal Govindarao in Phoenix and Kate Payne in Tallahassee, Florida, contributed to this report.