Study: AI Judges Influenced by Legal Precedent, Not Sympathy

A study has revealed that AI judges place more emphasis on legal precedent than their feelings-focused human counterparts.

The law is harsh, but it is the law: so says the AI judge in a recent study from the University of Chicago Law School. The study analysed the differences between AI and human legal decision-making, and found that human judges were significantly influenced by non-legal and emotional factors, unlike OpenAI's GPT-4o.

The findings raise questions about how closely AI can mirror human judgement, particularly in settings such as the legal system, where emotional intelligence and nuance are needed. Even as models continue to become faster and more capable, this empathy gap could hinder AI's potential.

AI Puts Its Faith in The Law While Humans Consider Emotional Elements

The original study set out to understand how federal judges and law students made legal decisions. Participants reviewed simulated appeals from defendants involved in international war crimes. Defendants were portrayed as either sympathetic or unsympathetic, and the judges and students were also given different levels of relevant and irrelevant information.


When faced with a sympathetic defendant, the federal judges were swayed in 65% of cases, even overturning convictions that legal precedent supported upholding. The law students, by contrast, stuck to precedent 85% of the time.

In the recent University of Chicago Law School study, GPT's decisions followed legal precedent more consistently. In fact, GPT stuck to the law in over 90% of cases, even when faced with more sympathetic defendants.

In both studies, the judges were also given hints that a defendant's conviction could be legally flawed. While GPT remained law-abiding in these cases, the human judges' decisions still depended on how sympathetically the defendant was portrayed, regardless of any other background information.

AI Unable to Tap Into Human Sensibilities

Each case reviewed in the study was designed to be legally ambiguous in order to test decision-making under uncertainty. However, even this did not seem to affect GPT's rulings.

In an attempt to bridge the gap between the AI and human judges, the researchers prompted GPT to think beyond simply following the letter of the law, tweaking the information fed into the bot with the aim of replicating the responses of the human judges.

The researchers briefed the AI on legal realism theory, which holds that judges draw on factors outside the law, such as emotional and social contexts, when determining a ruling. They also urged GPT to think more broadly about justice beyond the law, and even explicitly told it to view sympathetic defendants in a different light.

Despite these adjustments, the AI judges remained focused on legal precedent and did not weigh the emotional aspects of a case. The researchers concluded this was an example of legal formalism, in which rules and precedents are followed with no regard for personal feelings.

Use of AI Judges in Courtrooms Is a Possibility

Discussions around how AI will continue to establish itself within society’s systems are not going anywhere, particularly as platforms such as ChatGPT and Gemini continue to advance and integrate themselves into our daily lives.

Some experts suspect that AI will play a significant legal role in the future, with the National Center for State Courts having released guidance last year on using AI in the courtroom. However, in cases where AI has been used to produce court documents, there have been reports of it inventing fake case law, meaning there may be a long way to go before the legal system puts its trust in these bots.

We are yet to see AI operate in the same way as human judges do. A key component of an effective legal system is the ability to take the law and apply it to different circumstances, and AI may be missing the nuanced outlook human judges bring to that task. It is difficult to know whether these models will ever be on the same emotional wavelength.


Written by:
Nicole is a Writer at Tech.co. On top of a degree in English Literature and Creative Writing, they have written for many digital publications, such as Outlander Magazine. They previously worked at Expert Reviews, where they covered the latest tech products and news. Outside of Tech.co, they enjoy keeping up with sports and playing video games.