Lawyer Calls Evicting People A 'Patriotic Duty,' Apparently Cite-Checking Fake Cases In Briefs Less Of A Patriotic Duty

This sure sounds like another lawyer caught using ChatGPT to make up caselaw.

In 2007, attorney Dennis Block told the LA Times that he viewed it as his “patriotic duty” to evict people from rent-controlled properties. Recently, a judge reminded Block’s firm of an additional duty worth considering: not filing briefs with fake cases.

In what feels like a redux of the New York “ChatGPT Brief” case, Block signed off on a brief responding to a tenant attorney’s dismissal motion. But when that attorney settled down to read the response, she was in for a surprise:

“I felt a little bit like I was completely misreading or misunderstanding what was happening,” Nicholson said in an interview. “It was one of the weirdest opposition briefs I have ever received.”

While Block’s firm has not explicitly admitted to using an AI bot to write the brief, the presence of a number of completely fake case citations suggests another instance of consumer-facing AI hallucinating its way through a legal research question.

[Judge Ian] Fusselman began by asking [Block firm attorney John] Greenwood, “What do you have to say for yourself?”

Greenwood replied, “I have to say there was a terrible failure in our office. There’s no excuse for it.”

Greenwood said the attorney at Block’s firm responsible for drafting the filing relied on “online research.” He said “she didn’t check it,” and that she had since left the firm.

Judge Fusselman fined the firm $999, a buck shy of the threshold for reporting the incident to the state bar.

It’s the “didn’t check it” part that matters. It’s eye-catching to blame AI for these mistakes, but the AI didn’t do anything wrong — lawyers still have to read the cases they’re citing. Dumb, lazy lawyering is dumb, lazy lawyering regardless of the tool.

But what’s worrying about this case is that landlord attorneys often face off against pro se renters. Without competent opposing counsel to read the brief and catch the mistakes, a firm could easily get away with making up the law and kicking people out of their homes. In this case, Judge Fusselman apparently independently identified problems with the brief, but one can envision cases where the judge never drills into the briefing because a pro se litigant is in over their head.

It was all fun and games when lawyers screwed up against an airline’s legal team, but AI stands to exacerbate access-to-justice gaps when attorneys can bowl over pro se folks with fake law.

This Prolific LA Eviction Law Firm Was Caught Faking Cases In Court. Did They Misuse AI? [LAist]

Earlier: For The Love Of All That Is Holy, Stop Blaming ChatGPT For This Bad Brief
Stop Calling It A ‘Slap On The Wrist’ Just Because The Media Hyped It Up More Than It Deserved


Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you’re interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.