ChatGPT -- Are The Robots Finally Here?

The upcoming year looks to be a fascinating one for legal technology.

There has been much chatter about robots coming to take away jobs — including the jobs of lawyers. Last year, I wrote the article The Robots Are(n’t) Coming, in which I argued that technologies will evolve and help attorneys do their jobs better. In the past few weeks, we may have taken a giant leap forward on that evolutionary path.

On November 30, 2022, OpenAI launched ChatGPT, a massively scaled interactive chat robot that uses artificial intelligence to answer questions on almost any topic. Within five days, OpenAI claimed over 1 million ChatGPT users. While it is early days, the viral expansion of the technology (which is still a prototype) is staggering. Equally staggering is how articulate, detailed, and accurate the technology is in answering most questions.

ChatGPT is already being used for a number of different things. College students are submitting papers based on it. It can explain Einstein’s theory of relativity as a poem. It can debug software. And the technology can create what would appear to be a well-written contract, help write a law review article in an hour, or even write a brief to argue before the Supreme Court. I’ve played with the technology myself, asking legal questions like “Can you provide me with a contract clause for limiting liability for consequential damages if a business transaction is not completed?” It provided a thorough and convincing clause with explanations that looked pretty good to this nonattorney.

OpenAI’s Funding

OpenAI is structured as a nonprofit with a for-profit subsidiary. It is believed to be backed by more than $1 billion from a who’s who of technology. The OpenAI website mentions Microsoft, Reid Hoffman’s charitable foundation, and Khosla Ventures. Other investors referenced elsewhere include Andreessen Horowitz, Sam Altman, Y Combinator, and Elon Musk.

In addition to funding, ChatGPT has been fed and trained with information collected from across the Internet and possibly more. Many privacy policies — like Microsoft’s, for example — include statements that allow anonymized personal data to be used for AI training purposes. I asked ChatGPT if it used anonymized personal data from Microsoft. ChatGPT gave me a vague non-answer, which suggests its AI training included input from legal counsel.


Why Is ChatGPT Such A Potential Gamechanger For Lawyers?

First, the technology is scary good and operates at massive scale across many industries and applications. It learns. It admits mistakes. It writes extremely well in English.

By observation, ChatGPT’s AI training likely includes teaching it how to provide context and disclaimers that minimize legal issues. There are some safeguards intended to keep it from following harmful instructions. The system is disconnected from the internet and generates answers (output) that are original, which is intended to avoid infringing intellectual property.

It’s evident that ChatGPT has been trained on all types of legal documents, including SEC filings, contracts, and other legal information and commentary available publicly. These qualities suggest that it could be a highly useful tool for lawyers to have in their arsenal when working through legal issues.

What Can It Do To Help Lawyers Now?


ChatGPT can be a great starting point to ask questions, formulate hypotheses, find a clause, or test an argument. I would encourage anyone reading this article to register and try it out. The main caution is that, like legal research platforms, publications, or legal software, ChatGPT is just a tool or service to be used by attorneys. Attorneys must recognize the strengths and weaknesses of ChatGPT just as they do with any other tool. Responsibility for reviewing the work and for the legal advice rests squarely with the attorney.

What Are Some Limitations Or Dangers For Lawyers To Be Aware Of?

ChatGPT is still a prototype application. It will get better, and it will invite competition. For example, I see nothing stopping Amazon, Apple, Google, or Meta from using their extensive scale and access to internet data (and anonymized personal data) to build something similar.

ChatGPT lacks authority

When asked a question, ChatGPT will provide an answer, but that answer won’t include footnotes or specific references. It may mention a case name, but not a citation. Additionally, the AI is a “black box”: it can’t necessarily explain the logic and steps it took to reach a conclusion.

ChatGPT includes occasional bias

All AI is trained with human input, and there are numerous examples of how bias has been introduced into algorithms. Drawing conclusions from AI may include implicit bias that could influence an argument, disadvantage a client, or reinforce societal biases. Lensa, another viral AI technology best known for creating artistic profile pictures, has been criticized for several biases, including perpetuating gender bias. Attorneys should consider potential bias when using ChatGPT.

ChatGPT could be manipulated to create false answers

AI learns. When I asked ChatGPT, “Who invented the Trapper Keeper?” it gave me a wrong answer. (My father invented the Trapper Keeper.) I challenged ChatGPT through a series of questions. ChatGPT apologized, referenced the US Patent & Trademark Office, and changed its response to agree with me.

The point is that a coordinated set of people (or bots) could influence answers in ChatGPT for gain or nefarious purposes. Consider securities fraud — could a group of people try to convince ChatGPT that a vulnerable cryptocurrency exchange was safe and reliable? I wouldn’t be surprised if some unintended uses of the technology surface in the near future.

What Are Some Legal Applications That Might Benefit From ChatGPT In The Coming Years?

ChatGPT is still a prototype. Features will be added, and a law firm or company will likely be able to use its own data or other proprietary data with the technology. Here are three potential nearer-term applications.

Knowledge Management. Imagine when ChatGPT can be licensed by a law firm. Finding out how other clients have been advised on an issue by the firm, finding an expert on a topic, or unearthing legal advice could drive efficiencies and help create better work product for clients. The big issue would be how to manage privilege and conflict of interest checks when training the AI.

Contracting. ChatGPT can already create a contract or help come up with a draft of a clause. What if a law firm used ChatGPT to improve and standardize its contract templates? And what if contract lifecycle management (CLM) products leveraged ChatGPT?

Should an attorney use ChatGPT to draft a complex contract? No. But it already demonstrates interesting value as a tool.

Legal Research. Right now, ChatGPT does not include specific references to the law or perform citation checking. But what if it did? Developing a first draft of a brief or other work product might become very plausible in the future.

The upcoming year looks to be a fascinating one for legal technology. For 2023, the Chinese New Year rings in the Year of the Rabbit; for legal technology, 2023 could be the year of AI.


Ken Crutchfield is Vice President and General Manager of Legal Markets at Wolters Kluwer Legal & Regulatory U.S., a leading provider of information, business intelligence, regulatory and legal workflow solutions. Ken has more than three decades of experience as a leader in information and software solutions across industries. He can be reached at ken.crutchfield@wolterskluwer.com.
