
Why Law Firms Need the Right AI

Andrew Louder | May 1, 2024

Imagine you run a law firm.

 

You just invested in a new cloud-based legal research tool. This innovation is meant to speed up the tedious job of finding precedents to cite in support of arguments made in court filings. The system seems to work great. That is, until a judge correctly points out one major flaw—the case law cited in your latest filing is made up. No such cases ever existed.

 

This may seem like the plot of a twisted Twilight Zone episode, or the kind of nightmare that might leave even a seasoned lawyer in a cold sweat. Instead, it’s the reality of using consumer-grade AI software like ChatGPT for legal work.

 

Such inherent dangers were exposed in June 2023, when a federal judge blasted two lawyers for using ChatGPT to complete a filing in an aviation injury case. The court determined the filing contained six citations to fictitious cases. According to the judge, one of the ersatz citations had “some traits that are superficially consistent with actual judicial decisions,” while the others were mere “gibberish” or “nonsensical.” His reaction to the fake citations? A scathing rebuke no lawyer wants to be on the receiving end of.

 

To wit, Judge P. Kevin Castel wrote that the lawyers “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question.”

 

Think that’s bad? The lawyers’ tone-deaf response made their predicament even worse. One attorney claimed to the court that he was “unaware of the possibility that its content could be false.” He added that he “greatly regrets having utilized generative artificial intelligence to supplement the legal research performed herein and will never do so in the future without absolute verification of its authenticity.”


Incredibly, according to an affidavit related to the matter, one of the lawyers even tested the citation by asking ChatGPT if it was “lying,” to which the AI chatbot swore it was telling the truth. Many professionals who read about these lawyers’ professional embarrassment, not to mention their fines, were left wondering: just what went wrong?

 

In short, the problem is that these lawyers relied on consumer-grade tech to perform specialized professional work. The service provided to their clients was not much better than Googling case citations, with one major difference: Google search doesn’t “hallucinate.”

 

Unfamiliar with the term in this context? Hallucinating is the industry term for when ChatGPT and similar AI tools simply make things up. As the engineering organization IEEE describes it, a hallucinating chatbot generates outputs that are “semantically or syntactically plausible but are in fact incorrect or nonsensical.” The judge’s assessment reflects this reality: one fake citation came close enough to real law to look superficially plausible, while the others were simply gibberish.

 

ChatGPT hallucination is a serious issue. In fact, its developer, OpenAI, is being sued for defamation over false statements its tool made about a radio show host. A law firm like yours, of course, can’t afford fake citations in its legal filings; bogus information from a consumer-grade chatbot could open you up to lawsuits and other major risks.

 

And yet, on the other hand, many firms cannot afford custom GPT systems fine-tuned for the legal sector; such systems tend to be reserved for the largest players in the market. Luckily for small and medium-sized firms, there is a third option: cutting-edge AI tools built for law firms that are quick and easy to implement and, most important of all, don’t hallucinate.


My team at Louder Co. has helped law firms source the right AI tools to enhance their operations. One recent client came to us needing to serve his clientele better; his complaint concerned the lengthy time it took his firm to produce legal advice. As the client, whom we’ll call Peter, explained: “The people who hire us tend to work at the speed of the internet. But they feel my firm is holding them back. How on earth can we get faster?”

 

Although Peter’s firm may have been tempted to try ChatGPT for speedy answers, Louder Co. knew that professional firms like his must hold their AI to a higher standard. Our suggestion? After learning more about Peter’s firm, we recommended Document Intelligence from Thomson Reuters. The software platform provides fast answers to legal questions and helps lawyers draft legal documents, but with one major difference: it isn’t trained on some of the craziest corners of the internet, as ChatGPT is.


Legal experts trained it.

 

To this end, Thomson Reuters boasts that its cadre of legal experts spent 15,000 hours training Document Intelligence to answer legal questions. Now, when a lawyer queries the system, the hallucination risk is practically nil, because the information it collates comes from legal experts, not random people on the internet. Happily, implementation at Peter’s firm was smooth and the impact immediate. He says, “Client questions that once took days, if not weeks, now require minutes or hours. Our clients no longer feel like we’re holding them back.”

 

Interested in producing similar successes at your firm? Contact Louder Co. today to learn how we can empower your business with all the upsides of AI—yet without the pitfalls of tools not designed with your operations in mind.
