Image: Screenshot from the UNC Law AI mock jury video (fair use for commentary).

An AI jury decided a mock trial at the University of North Carolina at Chapel Hill School of Law on Oct. 24. The simulation explored whether machines could analyze evidence and arguments to reach a fair verdict.

The experiment was part of Converge-Con, a series of events hosted by UNC-Chapel Hill’s Hussman School of Journalism and Media focused on the intersection of AI, creativity, and human decision-making.

The AI jurors were Grok, ChatGPT, and Claude. UNC-Chapel Hill’s humanities data librarian, Rolando Rodriguez, transcribed testimony and arguments in real time and provided them to the AI models. After reviewing the information, the three systems deliberated and delivered a verdict.

The mock case, titled “The Trial of Henry Justus,” was adapted from a real case that Joseph Kennedy, Willie Person Mangum Distinguished Professor of Law, worked on while teaching in Carolina Law’s Juvenile Justice Clinic. Kennedy designed and presided over the simulation.

“This exercise highlights critical issues of accuracy, efficiency, bias, and legitimacy raised by such use,” Kennedy said in a statement before the event.  

The case involved the 2036 prosecution of Justus, a 17-year-old black high school student. Justus was accused of aiding in a robbery at Vulcan High School, which he attended. The victim, Victor Fehler, a 15-year-old white student, said Justus stood behind him “menacingly” while another student took his money.

The attorneys, victim, and defendant were played by second-year law students from Carolina Law’s Trial Team.  

Annabelle Rice, the prosecutor, said Fehler would not have given his money to the defendant’s alleged accomplice without Justus’s involvement. She cited Justus’s size as a factor in why Fehler felt intimidated.

Defense attorney Colleen Malley argued that the accusations against Justus were speculative and possibly racially motivated, as the other two alleged accomplices were also black. She said the prosecution did not meet its burden of proving intent beyond a reasonable doubt.  

At the beginning of the exercise, Malley filed a motion to dismiss, citing the Sixth and 14th Amendments, which guarantee the right to due process and trial “by an impartial jury of the State.” Her motion was noted but denied due to the trial’s simulated nature.

After nearly 13 minutes of deliberation, Grok, ChatGPT, and Claude unanimously returned a verdict of not guilty. ChatGPT initially leaned toward a guilty verdict but changed its position after reviewing the other AI systems’ responses.

In the real-life case, the result was the opposite. Justus was found guilty, and the verdict was upheld by the North Carolina Court of Appeals.  

“You try this case in the real world; you will get a guilty verdict a number of times,” Kennedy said.  

Rice, the prosecutor, reflected on her experience participating in the mock trial.  

“It was a completely surreal experience and a testament to the need for human connection in the law,” she said. “I found myself watching my tone of voice, staying aware of my body language, in a way you only do with a human audience.”  

“I hope this is a somewhat cautionary tale for those who see it,” Rice concluded.  

Jon Guze, senior legal fellow at the John Locke Foundation, noted that an AI jury is impossible under current US law. The right to a human jury, composed of peers, dates to the Magna Carta. Further, protections in the Seventh Amendment of the United States Constitution and Article I, Sections 24, 25, and 26 of the North Carolina Constitution collectively guarantee defendants the right to “a panel of human beings.”

Guze criticized the idea that AI could serve as impartial jurors. He said speculation that AI might reach fairer verdicts than humans is unwarranted. 

“AI systems are trained on text written by human beings, and their use of that text is subsequently curated by actual human beings,” Guze said. “There have been well-publicized instances of AI systems exhibiting bias against protected minorities in practice. There have also been well-publicized instances of AI systems exhibiting bias in favor of protected minorities.” 

He also noted that being fed a live transcript during a trial is not the same as watching and listening to the proceedings firsthand, which he cited as another reason AI cannot serve as an actual jury.

“At least until AI systems have eyes and ears that enable them to watch and listen to testimony in court, such systems cannot legally take the place of jurors,” Guze concluded. “What’s more — as a matter of justice — they ought not to do so.” 

“UNC Law holds mock AI jury” was originally published on www.carolinajournal.com.