- Two apologetic lawyers are facing potential sanctions after a court filing that included fake cases generated by ChatGPT
- The lawyers, identified as attorneys Steven A. Schwartz and Peter LoDuca, said that they were tricked into including fictitious legal research in the filing
- The lawsuit was filed against the Colombian airline Avianca over an injury allegedly incurred on a 2019 flight
Lawyers who used ChatGPT blame the artificial intelligence program for tricking them into including fictitious legal research in a court filing.
The lawyers, attorneys Steven A. Schwartz and Peter LoDuca, now face potential sanctions. The trouble arose from a filing in a lawsuit against an airline that included references to previous court cases the attorneys believed were real but were later found to be fabrications generated by the AI chatbot.
Lawyer Cited Fake Legal Research After Using ChatGPT
Schwartz explained that he turned to artificial intelligence because he was looking for legal precedents to support his client's case. The suit was against the Colombian airline Avianca, which is accused of being responsible for an injury incurred on a 2019 flight, as per ABC News.
ChatGPT has recently become a popular technology for generating essay-like answers based on users' prompts. The chatbot suggested several cases involving aviation mishaps to Schwartz and his partner, cases the lawyer had been unable to find through his law firm's usual research methods.
However, the cases the program cited either did not exist or involved airlines that did not exist. Schwartz told Judge P. Kevin Castel, who is overseeing the lawsuit, that he had been operating under the misconception that the chatbot was obtaining the cases from a source he could not access himself. The lawyer added that he "failed miserably" at doing follow-up research to ensure the legal citations were accurate and factual.
Castel pressed Schwartz on how he missed the fake citations, describing the chatbot's fabricated output as "legal gibberish." He added that ChatGPT was not simply supplementing the attorney's research; rather, it was his research, according to Business Insider.
Trying to Bolster His Client's Case
The case that Schwartz and LoDuca were working on involved a man who alleged that he was injured by a serving cart aboard a flight. The court filing included six court cases later found to be "bogus judicial decisions" with fake quotes and internal citations.
In a statement, Schwartz apologized profusely to the court, the defendants, and his firm, saying he deeply regrets the actions that led to the hearing.
The lawyer added that he has suffered professionally and personally because of the widespread publicity the issue has generated, saying he was "embarrassed, humiliated, and extremely remorseful."
Schwartz said that in his 30 years of working in the legal field, he has never been involved in anything like the recent controversy. He assured the court that nothing like it would ever happen again, the New York Post reported.