Arbitration Experts Test Generative AI Tools Amid Skepticism

Generative artificial intelligence has gradually gained ground in the legal sector over the past year, extending its potential influence to arbitration and dispute resolution.

Challenges stand in the way of wider adoption of the technology among arbitrators, but some are forging ahead with experimental use cases.

Testing the limits of generative AI is exactly what Bridget McCormack had in mind when the former chief justice of the Michigan Supreme Court took over as president and chief executive of the nonprofit American Arbitration Association, or AAA, in 2023.

The AAA provides alternative dispute resolution, or ADR, services for parties in commercial disputes.

McCormack, also a special adviser to the American Bar Association's AI Task Force, empowered the AAA's staff to experiment with the technology last year. The AAA issued licenses to its team members for large language models, or LLMs, to discover new ways to make arbitration and dispute resolution faster and more efficient.

"Generative AI is going to disrupt both the business of law and the practice of law, and the anti-democratization of law, pretty substantially over the next few years," McCormack told Law360 Pulse. "For providers of services to the legal profession, I think we're in for lots of change."

McCormack asked the AAA's staff in 2023 to submit ideas for how the organization could harness the technology for ADR right away. After vetting those ideas, the AAA developed a trio of AI-based ADR products trained on the organization's dispute resolution data.

The AAA released its first generative AI tool for public use in late 2023. That tool creates schedules for preliminary hearings in arbitration.

Another tool, for building clauses, is still in beta testing. The AAA is also working on a third tool: a chatbot for self-filers, which is intended to help those navigating their own arbitration and mediation matters.

As a result of these experiments, the AAA in December launched the AAAi Lab, a web center that gives AAA users, arbitrators, in-house counsel and law firms information and tools for using generative AI in ADR. The lab also features the AAA's guidance on AI in ADR.

"The lab was a way to make sure we captured all the work we were doing and shared it with the world and shared it with the ADR community, so they could give us other ideas, contribute [and] help us think about all the ways in which this new technology is going to make dispute resolution better," McCormack said.

David L. Evans, an attorney with Murphy & King PC and a certified neutral arbitrator and mediator, helped with the launch of the AAAi Lab by contributing to projects. He is also a former member of the AAA's board of directors.

Though generative AI is seldom used in ADR today, Evans expects the technology to become more common.

"The use cases in arbitration, I think, are starting to develop and they'll be primarily at the beginning oriented towards efficiencies in the process," Evans said.

Those use cases may involve assisting lawyers by generating documents, summarizing them and conducting research across large volumes of information, according to Evans.

Eventually, Evans said, generative AI could be integrated into the decision-making process. For example, a scenario could involve an AI model resolving disputes between parties by itself or in a hybrid role alongside human lawyers.

Challenges

McCormack said the legal community reacted positively to the AAAi Lab's launch in December, and she expects the use of generative AI in ADR to grow.

But skeptics remain.

Challenges are slowing the adoption of generative AI in ADR, and experts say they mirror those facing the broader legal profession, including some lawyers' hesitance to embrace new technologies.

Another issue is hallucinations, which occur when a generative AI tool produces a response containing false information.

McCormack said the AAA caught a hallucination in one of the legal generative AI tools it was testing last year. That is why the AAA installed guardrails during its generative AI experiments, such as double-checking output and not uploading case information that could run afoul of data privacy rules.

Experts recommend that users check the output of generative AI for any falsehoods, but McCormack said that might change in the future as the technology evolves.

"The things that we might worry about today might not be things we have to worry about in six months or 12 months," McCormack said.

The ADR industry is already taking steps to address hallucinations.

In August, the Silicon Valley Arbitration and Mediation Center published proposed guidelines for the use of generative AI in arbitration. This list of best practices included responsibilities for all parties involved in ADR.

For example, one guideline says that all participants should understand and be able to explain how a generative AI model arrived at its outputs, which would help ensure the parties stay on the lookout for hallucinations.

Another issue is potential bias in the data used to train LLMs, though this challenge has a bright spot.

"You can de-bias a dataset a lot more easily than you can de-bias a human," McCormack said.

Concerns also exist in the ADR community about AI running amok, according to Evans, referring to fears that AI will take over the world or do a poor job at tasks such as research.

"I totally understand that concern, but I think the benefits of the technology grossly outweigh the potential anxieties that are associated with it," Evans said.

Those benefits include opening up the legal process to more people and making ADR more efficient for legal professionals.

Evans also said that AI could make the process of dispute resolution easier for neutral arbitrators and mediators, which would make the technology more common in ADR.

"Neutrals that use AI are going to replace neutrals that don't use AI," Evans said.

--Additional reporting by Matt Perez. Editing by Robert Rudinger.


