
John Roberts Once Again Uses Judiciary’s Annual Report To Express His Utmost Contempt For The Public

The Chief Justice's year-end report on the federal judiciary may as well have been generated by ChatGPT.

(Photo by Alex Wong/Getty Images)

For Chief Justice Roberts, the Year-End Report on the Federal Judiciary is no longer a serious assessment of the state of the federal courts so much as a taxpayer-funded blog post for him to express his disdain for the American people.

You might suspect that an annual report on the federal judiciary would be designed to give the American people some sense that the Chief Justice of the United States grasps the issues facing the courts and, ideally, has some sort of plan for addressing them. After all, that’s the whole point of any annual report: to provide stakeholders with a sense of the successes and challenges facing an entity. It’s why a corporate 10-K can’t just decline to mention that the CEO is now wanted by Interpol.

While the federal judiciary in 2023 found itself beset by ethical scandals from top to bottom, jurists abandoning any sense of professionalism and decorum, a forum shopping crisis spawned by the lack of reform to the nationwide injunction procedure, and a criminal defendant openly attacking the judicial process and inspiring violent threats against federal judges, John Roberts addressed… none of these.

Every year, I use the Year-End Report to speak to a major issue relevant to the whole federal court system.

No, he does not.

Two reports ago, when the judiciary faced a massive recusal scandal and horrifying allegations of workplace harassment, Roberts blew off these concerns with vague hand-waving about having more training webinars and otherwise chided the public for daring to question the courts.

Alas, this would mark the last time he even tried to use the report for its intended function.

The next year, he used the report to tell a historical anecdote about a heroic judge born in 1904, unintentionally highlighting that it’s hard to find a more recent laudatory example, and simply refused to acknowledge anything that actually happened in 2022.

So, back to the present report, what’s the Chief’s “major issue relevant to the whole federal court system” for 2023?

As 2023 draws to a close with breathless predictions about the future of Artificial Intelligence, some may wonder whether judges are about to become obsolete.

Are you fucking kidding me?

There may be serious concerns about whether judges are about to become obsolete, but that has less to do with AI than with administrations flooding the courts with unqualified judges and the highest court in the land being stocked with hacks who use their public office to collect luxury vacations.

Apparently, Roberts saw the report this year as an opportunity to tickle the clickbait impulses of legal reporters eager to spill a few hundred words speculating about AI rather than focusing on the Chief’s silence on the most pressing issues undermining the legitimacy of the third branch.

This is the opening paragraph of the report.

Sometimes, the arrival of new technology can dramatically change work and life for the better. Just one century ago, for example, fewer than half of American homes had electricity. During the New Deal, the federal government set out to “bring the light” to homes across rural America. Representatives recruited farmers to join electricity co-operatives for $5 each. Then came teams of men to clear the brush, sink the poles, and wire homes to the still inert grid.

Oh. I see what you did there. Rather than open with almost any tale of technological advancement, Roberts subtly reminds us of the sort of life-improving public infrastructure project that his Court would strike down with extreme prejudice.

Trolls gonna troll.

Roberts then devotes several pages of the 2023 report to the history of typewriters and personal computers. Gut check time: is this the sort of paragraph that you as a citizen want to read in a 21st century annual report on the most relevant issues facing the judiciary?

The transition to more modern forms of document production began 150 years ago, with the appearance of the Sholes & Glidden Type Writer, first manufactured in 1873 and famous shortly thereafter as the Remington.
Most judges still wrote their drafts by hand, but the typewriter became an important tool in the dissemination of judicial opinions both internally and to the outside world. In 1905, Justice David Brewer somewhat ungenerously referred to his law clerk as “a typewriter, a fountain pen, used by the judge to facilitate his work.” Until the invention of the Dictaphone, law clerks of this vintage also had to take dictation, and at least one otherwise well qualified law clerk lost his job due to “lack of stenographic knowledge.”

This man is deeply unserious.

Before Roberts released the report, Gabe Roth of Fix the Court sent around a pre-buttal outlining the organization’s proposals for dealing with judicial ethics because OBVIOUSLY the report would focus on ethics after the bombshells of 2023. Roth didn’t count on Roberts staring at his duty to address legal ethics and channeling one of those scriveners of old by declaring, “I would prefer not to.”

While obviously of secondary importance, the report is also a piss-poor analysis of artificial intelligence. Roberts offers the sort of mealy-mouthed, non-committal take on the subject that might’ve been excusable in an article about legal AI from 2014, but not 2024.

Proponents of AI tout its potential to increase access to justice, particularly for litigants with limited resources. Our court system has a monopoly on many forms of relief. If you want a discharge in bankruptcy, for example, you must see a federal judge. For those who cannot afford a lawyer, AI can help. It drives new, highly accessible tools that provide answers to basic questions, including where to find templates and court forms, how to fill them out, and where to bring them for presentation to the judge—all without leaving home. These tools have the welcome potential to smooth out any mismatch between available resources and urgent needs in our court system.

But any use of AI requires caution and humility. One of AI’s prominent applications made headlines this year for a shortcoming known as “hallucination,” which caused the lawyers using the application to submit briefs with citations to non-existent cases. (Always a bad idea.) Some legal scholars have raised concerns about whether entering confidential information into an AI tool might compromise later attempts to invoke legal privileges. In criminal cases, the use of AI in assessing flight risk, recidivism, and other largely discretionary decisions that involve predictions has generated concerns about due process, reliability, and potential bias. At least at present, studies show a persistent public perception of a “human-AI fairness gap,” reflecting the view that human adjudications, for all of their flaws, are fairer than whatever the machine spits out.

If AI had a sense of shame, ChatGPT would be embarrassed by this level of superficiality.

If Roberts had a sense of shame, he should be too.

Earlier: Chief Justice Wants You To Know He Has The Utmost Contempt For You
Chief Justice’s Annual Report Recounts 65-Year-Old Tale Of Judicial Heroism To Remind You There Isn’t Any Today


Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you’re interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.