White paper

Don’t want to become a “ChatGPT lawyer”?

Make sure you’re using professional-grade AI

It’s a now-common scenario: a high school teacher quickly determines that a student's paper is the work of the AI model ChatGPT. Giveaways include stilted language, obvious factual errors, a lack of verifiable references, and jumps to unsupported conclusions. The student gets a failing grade or has to rewrite the paper.

Some lawyers fear that using AI for a legal task will produce work of a similar caliber to this hypothetical student’s paper. They’re afraid of suffering the same fate as the many “ChatGPT lawyers,” whose output was inferior and rife with errors, including hallucinated case summaries and inaccurate data analyses, and whose poorly argued briefs left judges baffled.

Based in part on the prevalence of these stories in the media, some managing partners and heads of law practices still think “AI could turn out to be a fad. It will never reach the stage where a legal professional will feel confident employing it for use in litigation, or to fulfill a client’s objectives. So why should we invest in it?”

It's not an ungrounded argument. It’s quite likely that consumer-grade AI will never be up to the task of generating sophisticated legal work that passes muster in the courtroom. That’s why a law firm needs to distinguish between publicly available, no-cost solutions such as ChatGPT and a professional-grade AI solution specifically designed for use by lawyers.

Much already separates these two types of solutions in terms of reliability and accuracy, and the distinctions are becoming even starker. It’s the difference between Googling to see what a spot on your arm might be and having that spot examined by a dermatologist.

Making an AI commitment

A legal professional who regards AI as a fad is making a mistake. It’s now a matter of when, not if, your organization will adopt it in some form. That’s why choosing the right AI tool for your firm is crucial.

The savings potential for a law firm using AI is vast. According to the Thomson Reuters Future of Professionals Report 2024, professionals estimate that AI will free up about four hours per week within the next year, rising to roughly 12 hours per week within five years. Four hours a week works out to about 200 hours a year, essentially the equivalent of adding one team member for every 10 employees. The potential return on investment is equally alluring: for a U.S. lawyer, those reclaimed hours could translate into an estimated $100,000 in additional billable time per year.
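As a rough illustration of how these figures fit together (assuming a 50-week working year and an average billing rate of roughly $500 per hour, neither of which is specified in this paper):

    4 hours saved per week × 50 weeks ≈ 200 hours per year
    200 hours × ~$500 per billable hour ≈ $100,000 in additional billable time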

It’s important to stress that not all AI is created equal, and that potential savings and efficiency improvements will greatly depend on the type of AI system your organization chooses. Public AI, such as ChatGPT, is often enough for general consumers — it’s good at tasks like creating a packing list for an overseas trip or compiling a holiday dinner menu. But there’s a good reason Google is free while law firms pay to access a proprietary resource like Westlaw.

When a lawyer needs to use the correct case citation, precisely define terms in a contract, or confidently rebut an opposing counsel’s claim, the lawyer requires a resource built on secure, up-to-date, and fully verified data. ChatGPT isn’t designed for these tasks, but certain legal AI solutions are. It’s time to find which system works best for your organization.

The AI paradigm shift

Timing is everything when it comes to AI in legal practice. Firms that adopt AI-powered solutions quickly stand to reap substantial benefits in efficiency and productivity. In the years to come, early adopters could pull so far ahead that latecomers will have a hard time catching up, losing clients and market share to their more tech-savvy rivals.

For one thing, AI tools automate, augment, and accelerate a wide array of tasks for attorneys, particularly time-consuming research and drafting. By offloading these tasks to AI, attorneys can devote more time to higher-level work, such as strategic planning and business development. Very quickly, AI becomes a force multiplier, giving smaller teams and law firms the power to punch above their weight. For example, a 40-lawyer firm armed with a professional AI system could do work equivalent to, if not greater than, that of a firm twice its size that was late to adopt new technologies.

As for top-level law firms, AI will enable them to take on ever more complex work that, in the past, would have required additional hires and substantial lawyer hours dedicated to a single high-level project.

Reducing time-consuming, repetitive grunt work also enables lawyers to achieve a better work-life balance and reduces the risk of burnout, particularly for junior lawyers. These attorneys often must sacrifice a weekend to pore over case histories and search for specific clauses in a great pile of documents. Now, AI solutions can do much of that work in minutes. In the Thomson Reuters Institute 2024 Generative AI in Professional Services survey, among respondents who said they were “hopeful” about AI, 31% cited the prospect of efficiency gains; among respondents who said they were “excited,” that figure rose to 41%.

There’s another thing for reluctant lawyers to consider — whatever your feelings are on the matter, your firm’s biggest corporate clients may expect you to adopt AI. Their logic is sound: if they’re using AI to enhance their operations, why isn’t their law firm? At some point, and possibly quite soon, these clients won’t accept traditional hourly billing from their law firm for tasks that they believe could be completed in a fraction of the time with fewer personnel.

“AI is an inevitable solution in my opinion, and resources that are not fit to be used will find themselves at the back of the line,” said one government legal advisor, interviewed for the 2024 Generative AI in Professional Services survey. “There is great risk of loss of performance compared to the economy if we do not use it; it is the same issues as when Google released its famous white search box. And, if you don’t use a search engine, you’re out.”

Professional AI versus consumer AI: A tale of two different worlds

Legal professionals will significantly benefit from using professional-grade AI tools. These systems are built and continually updated by legal experts to meet the specific needs of lawyers. They run on trusted, secure databases, with reasoning that is transparent, sourced, and verified.

Within the category of professional-grade AI solutions, measurable differences exist among platforms. Recently, Vals evaluated leading professional AI services, comparing how they performed against one another in areas such as data extraction, document Q&A and summarization, redlining, transcript analysis, chronology generation, and Electronic Data Gathering, Analysis, and Retrieval (EDGAR) research.

Accuracy: On target with each shot or firing wildly

Consumer AI. Say that a lawyer is searching for “most-favored-nation” clauses within a set of trade agreements. If they use a consumer-grade AI solution, odds are the system will return an array of outputs of varying quality, some of which aren’t applicable in a legal context. Furthermore, the system could have difficulty when asked to extract multiple clauses at once.

Professional AI. Professional-grade AI solutions offer greater clarity and reliability in terms of data extraction. In the Vals study, a set of professional AI systems was assessed to see how each identified and extracted information within the same document. Per Vals, “All AI tools were given the additional instruction to include only the extracted text in their responses, not any additional explanation. In all cases, a good response required both the verbatim extraction of specified information from the provided document(s) and the source location for that information.”

The study found that professional-grade AI tools performed this task reasonably well, particularly with short documents and a single specific clause to extract. There were notable differences, however, when the systems were asked to extract multiple clauses for a single question — some systems stopped at one relevant clause without providing a substantial answer.

The most challenging question asked for 40 fields to be extracted across three credit agreements. CoCounsel was the top performer on that task. CoCounsel and one competitor also performed above the “lawyer baseline,” the measured performance of an experienced human lawyer completing the same task.

Content authority: Trusted sources or unreliable information

Consumer AI. A public AI platform like ChatGPT is built on a foundation of opaque sources, ranging from public-domain books to material scraped from the internet, some of which may be copyrighted or inaccurate. A lawyer who asks a public AI system for a list of cases relevant to their current task will then have to go over the information they receive with a fine-tooth comb: is it derived from Wikipedia? Out-of-date legal references? News articles from generalist sources?

Professional AI. A professional-grade AI platform draws its content from industry-respected and thoroughly vetted sources, including federal and state legislation and case histories. When integrated with legal research tools such as Westlaw, users gain access to a verifiable and consistently updated database of case law, statutes, and legal opinions. This capability enables professionals to work with confidence, knowing their queries and research are grounded in authoritative content.

Additionally, a system like CoCounsel has a dedicated method for benchmarking any new large language models (LLMs). Each new LLM is run through a series of public and private legal tests to assess its aptitude for legal review and analysis. CoCounsel’s Trust Team creates further test cases to evaluate output that comes from the LLM’s early integration with existing systems, and, if the results are promising, a team of attorneys performs additional manual reviews.

Expert insights: Thousands of credentialed experts or a faceless crowd

Consumer AI. In consumer generative AI (GenAI) solutions, it can be challenging to determine how the system prioritizes which sources to use in answering a query. There’s a valid concern that the AI will assign the same weight to a trusted information source — such as a news organization like Reuters — as it does to a financial blog written by a layperson that hasn’t received an update since 2015.

Professional AI. Developers of a professional-grade AI model will explain to users how their system works, the processes that the model uses to answer queries and conduct data searches, and all steps the provider takes to refine its AI’s capabilities. A professional-grade AI provider will implement regular corrective measures to prevent data drift — any changes in the statistical characteristics of input data over time that degrade the model’s performance.

Hence the importance of having a vast number of experts on hand. These experts keep the AI solution abreast of any new regulations, laws, or case interpretations, and help ensure that its output is consistently accurate. An AI system is only as strong as the people whose knowledge it draws upon.

Data security and privacy: Fort Knox or Penn Station

According to the 2025 Generative AI in Professional Services Report from the Thomson Reuters Institute, approximately 68% of respondents stated that data security was a top barrier to adopting AI. This is another critical distinction between consumer and professional-grade AI products.

Consumer AI. Using a consumer product like ChatGPT to analyze proprietary information means that you’re often taking risks with your data. It can be unclear how these AI programs use prompts and other user data, or whether the system deletes or retains the information for training purposes after a query. If this information is client-sensitive or copyright-protected, consumer AI products typically lack adequate safeguards to protect it.

Also, some public AI providers make vague or even evasive claims about data usage, the ownership of client data, how often the data gets deleted, and which third parties will have access to it. Users may not have the option to exclude their input from being used in training the AI system.

Professional AI. A professional-grade AI provider must reassure users that it is taking the necessary measures to safeguard customer data. For example, CoCounsel bases its cybersecurity program on several industry standards, including the National Institute of Standards and Technology (NIST) Cybersecurity Framework and the Capability Maturity Model Integration (CMMI).

It’s essential that users of professional AI systems retain ownership of all input and output data. The provider should describe how its model uses data for systemic improvements, outline the options for data anonymization, and explain how clients can opt out of the training process.

Further, AI providers must establish a well-defined data deletion process with specific timelines to ensure user privacy and compliance (for example, deleting all data after 30 days unless otherwise specified). They should also offer transparent disclosure of any third-party data sharing. In the event of a data breach, the platform should have a well-honed incident response plan, including communication protocols and guarantees of timely notification to all relevant parties.

Training: Professional classroom or DIY

Consumer AI. Learning to use a consumer AI system often involves a do-it-yourself approach. While these systems are generally user-friendly and intuitive, there’s a learning curve in mastering query refinement to receive the best results.

However, it’s becoming an increasingly challenging task to keep up with AI’s rapid pace of adoption and to stay abreast of changes in AI systems. For example, the public version of ChatGPT was unveiled in November 2022 and has already gone through multiple revisions.

More than half of respondents in the recent Generative AI in Professional Services Report said that while they’ve had some AI training, it’s only on an annual basis — or even less frequently.

Professional AI. A professional-grade AI provider should prioritize regular, intensive user training for its system. Ultimately, an AI platform must be intuitive and supportive so lawyers feel confident in their ability to use it effectively, articulating their queries accurately and completing tasks efficiently.

For example, CoCounsel offers pre-recorded training modules meant to address typical user concerns and real-time connections with experts to clarify any aspect of the system.

Sidebar: Getting AI right in the courtroom

You’ve decided to use an AI program to assist in trial preparation. Here are three essential steps to ensure compliance and avoid getting chastised by the court.

Step 1: Check your cites

Verify AI outputs by ensuring citations are accurate and quoted language is from a legitimate source, and double-check any statutes listed. Using AI doesn’t relieve a lawyer of their professional and ethical obligations. Instead, it’s crucial to verify any AI-generated information before submitting it to the court, just as any lawyer must review work done for them by a paralegal or junior associate.

This is why professional-grade AI systems matter. A solution like CoCounsel automatically provides citations and links for all its findings, enabling you to quickly verify the accuracy of the output.

Step 2: Know your judge

Judges and magistrates set the rules for their courtrooms, and each judge has their own opinion about using AI for legal work. As one judge said in the 2024 Generative AI in Professional Services study, “I think AI has the capacity to increase the efficiency of some processes and thereby possibly enhance access to justice, but it has just as much potential to undermine the justice system if it is misused."

Be mindful of judge-specific procedures. Many federal judges will display AI-related orders on their individual pages on the United States Courts website. For example, Judge Michael M. Baylson of the Eastern District of Pennsylvania links to a standing order that “any attorney for a party, or a pro se party, [who] has used Artificial Intelligence (“AI”) in the preparation of any complaint, answer, motion, brief, or other paper, filed with the Court … MUST, in a clear and plain factual statement, disclose that AI has been used in any way in the preparation of the filing, and CERTIFY, that each and every citation to the law or the record in the paper, has been verified as accurate.”

Step 3: Disclose your AI use

Read each judge’s order closely to ensure you’re in compliance. Some judges may only require litigants to disclose that they’ve used AI, while others will specify that litigants must list the portions of filings drafted by AI. Some judges could require certification that the use of AI did not result in the disclosure of confidential or proprietary information to unauthorized parties.

Lawyers should be ready to provide a detailed explanation of how they used AI in drafting a brief, for example. They should concisely describe the AI model used and how it was specifically employed in the case at hand. For lawyers using CoCounsel, a help center article provides guidance on the language to use. Lawyers could also provide expert witness testimony to bolster the credibility of the AI solution, if they feel that’s appropriate.

It's essential to be as transparent as possible. Concealing how you used AI in a brief can lead to significant issues if the judge or other parties subsequently learn that AI played a more substantial role than initially indicated.

The value of going professional

Your organization will need to draw upon AI as a resource, and sooner than you might think. More importantly, you need a reliable, professional-grade AI solution, one that places your attorneys on a different tier from the so-called “ChatGPT lawyers” of the world.

It’s all about finding the right tools to do a complex job. CoCounsel leverages the expertise of Westlaw and Practical Law to provide a robust solution you can rely on.

Request a free demo of CoCounsel today.

CoCounsel: The AI assistant for professionals

Work smarter and faster with the advanced artificial intelligence that’s with you every step of the way