A Utah lawyer was punished for filing a brief with ‘fake precedent’ made up by artificial intelligence

The Utah Court of Appeals sanctioned the lawyer for citing a case that the AI program ChatGPT invented.

(Trent Nelson | The Salt Lake Tribune) A photo from inside the Matheson Courthouse in Salt Lake City on Friday, Oct. 20, 2023.

In a Utah first, a lawyer was punished last week by the state Court of Appeals for filing a brief with “fake precedent” created by artificial intelligence.

The use of AI was discovered when one of the cases cited in the argument did not exist in court records, and several other cited cases dealt with unrelated topics and were described incorrectly, the result of ChatGPT’s “hallucinations,” the inaccurate information AI chatbots sometimes invent.

The lawyer who submitted the filing to the appeals court, Richard Bednar, acknowledged to the court that AI was used to prepare the brief. Bednar said it was done by a law clerk without his knowledge and he did not fact-check the document before filing it with the court.

The case at issue, ongoing for more than five years, involves a former executive of a Layton company who was demanding payment for shares in the company after what he contended was an improper firing.

Bednar had sought to appeal a decision by the lower court to the court of appeals. The petition asking the court to hear the appeal cited a 2007 case, Royer v. Nelson, as legal precedent to support the argument that the court should intervene.

The case of Royer v. Nelson does not exist.

The opposing counsel, in a reply brief, noted that the only way they were able to locate any mention of such a case was by asking ChatGPT. When they followed up by asking ChatGPT if it was a real case, the AI program apologized, saying it must have been mistaken and could not find any record of the case anywhere.

Opposing counsel also flagged four other cases, noting that they were not remotely related to the matter at hand and that Bednar’s petition for appeal described them inaccurately.

The appeals court subsequently rejected Bednar’s petition for review.

Bednar’s attorney, Matthew Barneck, said Friday that, while the research was done by a clerk, “you can’t point fingers and maintain credibility with the court,” so Bednar took responsibility for failing to review the cases.

“That was his mistake,” Barneck said. “He owned up to it and authorized me to say that and fell on the sword.”

In an unsigned opinion by the Court of Appeals, the judges seemed to agree with that sentiment.

“We agree that the use of AI in the preparation of pleadings is a research tool that will continue to evolve with advances in technology,” the court wrote. “However, we emphasize that every attorney has an ongoing duty to review and ensure the accuracy of their court filings.”

Counsel in this case, the court wrote, “fell short of their gatekeeping responsibilities as members of the Utah State Bar when they submitted a petition that contained fake precedent generated by ChatGPT.”

“This court takes the submission of fake precedent seriously and finds that sanctions are warranted,” the court wrote.

Bednar was ordered to pay the attorney fees the opposing party incurred in responding to the petition and participating in the inquiry into the matter. He was also directed to refund any fees charged to his clients for filing the AI-generated motion. And he must make a $1,000 contribution to And Justice For All, a nonprofit that provides free and low-cost legal services to low-income Utahns, victims of domestic violence and marginalized groups.

The court could have imposed additional penalties, but determined that Bednar did not intend to deceive the court.

The Utah State Bar said in a statement that its disciplinary matters are strictly confidential and it cannot comment on whether it is also investigating Bednar for potential misconduct.

However, it said, the Bar’s Office of Professional Conduct “takes seriously any conduct that may compromise the integrity of the judiciary and the legal profession. Additionally, the Utah State Bar is actively engaging with practitioners and ethics experts to provide guidance and continuing legal education on the ethical use of AI in law practice.”

Barneck said he is not aware of any action by the Office of Professional Conduct, and he does not think the court will refer the matter for investigation, which it can do — although it is possible that some other party might.

Prominent Utah defense attorney Greg Skordas said the Court of Appeals was right to impose sanctions and the punishment seemed reasonable, given that it didn’t appear to be malicious. The use of AI, he said, has become common (although Skordas said he has not used it himself).

“Lawyers are using AI all the time,” he said. “[It] is being used far more often than people expect and attorneys are using it more and more to write briefs. But at the end of the day, the attorney absolutely needs to read the citations and the cases and make sure what is being put out by ChatGPT or whatever they’re using is accurate.”

Nationally, this is not the first instance in which attorneys have been sanctioned for using artificial intelligence. In New York, a judge in 2023 fined attorneys and ordered them to apologize to a plaintiff and six judges whose names were on non-existent judicial opinions filed with the court to bolster an argument.

And in January, a federal district judge rebuked a professor who is an expert on the use of AI and deception, after he submitted a declaration to the court that was drafted by ChatGPT and cited non-existent academic articles on the topic, the Minnesota Reformer reported.

Bednar was previously disciplined by the Utah State Bar, which in 2012 suspended his law license for three years based on complaints filed by four members of the military in discharge cases when he practiced in Virginia. In each case, Bednar accepted the clients, was paid a fee and failed to provide the services he had promised.

Correction, 10 a.m. • This story has been corrected to say that one case cited in the brief did not appear in court records.