Even If Technology Gets Every Call Right—Something Still Feels Wrong

This isn't just about tennis.

Reports from Wimbledon this past week revealed a historic shift that offers a warning for anyone concerned about the role of technology. For the first time in 148 years, human line judges were entirely replaced by an AI system that called every "out." Gone were the trained eyes, the decisive arm gestures, and the human presence that have defined the game for over a century. In their place was algorithmic precision delivering digital certainty.

While the technology promised flawless accuracy, reports of players questioning unchallengeable calls, staring at the empty spaces where line judges once stood, revealed a deeper truth about decision-making. It's not just about being right. It's also about the process and the human element that give decisions their weight and legitimacy.

The Human Element We Can't Automate

When a human official makes a controversial call, players and fans have someone to respond to, to argue with, to hold accountable. This moment of disagreement, even frustration, isn't just part of the experience. It gives the game its structure and rhythm, and makes it feel fair even when the call goes against a player.

With a fully automated system, that dynamic vanishes. Players who feel robbed by an "out" call can't stare down a computer or plead their case. Reports indicated that several players expressed open dissatisfaction, some convinced the calls were flat-out wrong. One player highlighted a moment where the system showed the ball in, but he was certain it was out. Even players who generally favored automation conceded the difficulty when their own eyes contradicted the final, unchallengeable call.

There's no satisfaction in arguing with an invisible system. No release, just lingering doubt about the process itself.

The match's rhythm was also disrupted, according to some player accounts. The subtle timing (the pause after a close call, the emotional momentum of a challenge, the mental reset between points) was gone. While the game may have flowed faster, it felt profoundly different to some. This rhythm, often overlooked, is crucial. It helps players settle, gives the audience a shared sense of anticipation, and makes the game feel alive and fair.

What This Means for Justice

If some people don't trust a machine to call a tennis ball fairly, why would they trust one to make decisions about custody, sentencing, or liberty?

This isn't hypothetical. AI has already entered our courtrooms, drafting orders and suggesting outcomes. These systems create shortcuts that may begin bypassing a crucial step in the justice system: a human being who will take responsibility for the decision and explain the reasoning behind it.

The justice system isn't merely another industry to be optimized. It's the foundation of a free society. Every day, courts make deeply human decisions: who raises children, who goes to jail, who gets a second chance. These decisions demand empathy, experience, and context. But just as importantly, they require a process that feels fair and legitimate to those affected.

Judges deciding custody aren't just applying standards. They're determining whether a child sees one parent daily or twice a month. And no parent should ever wonder if they lost custody because a court used the wrong algorithm.

A sentencing decision isn't solely about guidelines. It's about justice for victims, the potential for rehabilitation, and community safety. And the people involved need to feel heard, understood, and know that a human being weighed their circumstances before making a decision.

This is precisely why the notion that AI should bear the heavy lifting in judicial work isn't just wrong. It's dangerous. Machines can analyze data and identify trends, but they cannot grasp moral complexity, feel the consequences of their decisions, or provide the sense of being heard that legitimizes the process.

When we allow technology to shape the initial draft of justice, we risk the tool leading the analysis, rather than the other way around. More importantly, we risk losing the human process that makes justice feel legitimate.

Getting the Balance Right

This doesn't mean technology has no place in the justice system. AI can certainly streamline routine tasks, help identify patterns in case law, and organize information. However, we must be absolutely clear about where its role ends. Decision-making and accountability must rest with a person who can explain, defend, and bear responsibility for the decision.

The Warning from Centre Court

What is unfolding at Wimbledon offers an important lesson. When we remove humans from consequential decisions, we may gain efficiency and accuracy, but we lose something harder to quantify yet equally important. The sense of fairness.

In the justice system, we cannot afford to make this trade. People need to feel that they were heard and that justice was done, not just that the right outcome was reached.

This is something no algorithm can offer, and something we should never automate away.

Subscribe to my Substack newsletter today so you don’t miss out on a post. https://judgeschlegel.substack.com
