An unprecedented incident in the US brings Artificial Intelligence (AI) to the center of the judicial process – and human ethics.
Chris Pelkey, 37, was killed in a shooting in Arizona three years ago. In early May 2025, he “reappeared” via AI in the courtroom, addressing his killer, Gabriel Horcasitas, during the sentencing phase.
His family used audio recordings, video and photographs to reconstruct Chris's image and voice. He appeared in a grey baseball cap, uttering words of forgiveness: "In another life, we could probably be friends."
Technology as a tool of comfort or as a legal precedent?
The intervention was welcomed by Judge Todd Lang, who said he was moved by the initiative and recognized the power of forgiveness. However, the use of AI to "speak" for a victim from beyond the grave has drawn scrutiny from lawyers and ethicists.
Former federal judge Paul Grimm stressed that the use of the video did not affect the outcome of the trial — guilt had already been determined by a jury. However, he noted that it opens up a new avenue for how technology can be incorporated into judicial proceedings, especially when juries are not involved.
On the other hand, business ethics professor Derek Leben (Carnegie Mellon University) expressed strong concern:
“Can we always be sure that what the AI ‘says’ reflects the victim’s own wishes?” he wondered.
The dilemma of a new era
The victim's sister, Stacey Wales, argued that the family approached the matter with respect and responsibility. "Just as a hammer can build or destroy, so too does Artificial Intelligence depend on the hands that wield it," she said.
This incident marks a turning point: AI may offer emotional closure to relatives, but at the same time it raises legal and ethical questions.
Are we heading towards an era where the voice of the dead will be systematically "heard" in courtrooms?
And if so, who will define what they say?
Sources and information:
- BBC News (2025), article source
- Duke University Law School – Prof. Paul Grimm
- Carnegie Mellon University – Prof. Derek Leben
- Arizona State Supreme Court – AI in legal decisions
Readers' Poll | Artificial Intelligence and Justice
Do you believe it is ethically acceptable to use Artificial Intelligence to make victims "speak" in legal proceedings after their death?
⬜ Yes – As long as it is done with the family's consent, it can provide emotional closure and justice.
⬜ No – It is a violation of human dignity and can distort the victim's intentions.
⬜ It depends – Each case is unique and must be examined with strict ethical criteria.
⬜ I don't know / I have no opinion.
Leave your opinion in the comments! We want to hear your voice at info@greekradiofl.com.
From the journalistic team of Greek News and Radio FL
Photo credits: Stacey Wales/YouTube (AI image); local_doctor/Adobe Stock – https://www.fastcompany.com/91331139/ai-brought-a-road-rage-victim-back-to-life-in-court-experts-say-it-went-too-far