This lawyer used ChatGPT for legal research. Now he faces sanctions
KEY POINTS
- A lawyer may be subject to sanctions after using ChatGPT to prepare a lawsuit.
- The chatbot supplied the “bogus” court cases that the lawyer cited in his brief.
- The judge said he faced an “unprecedented circumstance”.
A New York lawyer could be sanctioned after he used ChatGPT for legal research that turned out to be false.
Steven A. Schwartz, who has been practicing law in the state for more than three decades, was on the legal team of Roberto Mata, the man who sued Avianca over an alleged incident in which a serving cart struck and injured his knee.
Mr. Schwartz, a lawyer at the firm Levidow, Levidow & Oberman, prepared a brief that cited precedent to argue why the case should move forward after Avianca’s attorneys asked a federal judge to dismiss it.
But the brief raised eyebrows among the airline’s legal team, who wrote to the judge that they could not locate several of the cases it cited.
The judge ordered Mr. Schwartz and one of his colleagues, Peter LoDuca, to explain why they should not be punished, writing in his order that he was faced with an “unprecedented circumstance”.
“Six of the cases presented appear to be fictitious judgments with fictitious citations and fictitious internal citations,” Judge P. Kevin Castel wrote.
Mr. Schwartz wrote that Mr. LoDuca’s name was on the paperwork because Mr. Schwartz was not admitted to practice in federal court, where the lawsuit ended up after originally being filed in state court. He said that he continued to do all the legal work on the case and that Mr. LoDuca did not know he was using ChatGPT to do it.
In his own statement, Mr. LoDuca said he “had no reason to doubt” the cited cases or Mr. Schwartz’s research.
Mr. Schwartz wrote that he “did not intend to deceive” the court or Avianca and “greatly regrets” using ChatGPT, which he said he had never used for legal research before.
He wrote that ChatGPT “provided its legal source and assured the reliability of its content”, but ultimately “proved to be unreliable”.
Attached to his affidavit are screenshots of what appears to be part of Mr. Schwartz’s conversation with ChatGPT.
In the screenshots, ChatGPT is asked whether one of the cases it provided is real. After the chatbot says it is, Mr. Schwartz asks for its source.
The chatbot responded that “after rechecking” the case was real and could be found in legal research databases. It also said the other cases it had listed were real.
The lawyers have been ordered to explain at a June 8 hearing why they should not be sanctioned.