New York Lawyer Used ChatGPT To Write Legal Brief

A New York attorney is in legal hot water after using OpenAI’s ChatGPT to help compose a brief that he submitted to court. Steven Schwartz’s misstep came to light after the chatbot filled the document with citations to cases that do not exist.

This well-documented tendency of chatbots is called “hallucinating.” Much like people, AI has a tendency to make things up as it goes, confidently inventing details rather than admitting it doesn’t know, though Schwartz was unaware of this limitation.

The lawyer said in an affidavit that he had no idea the AI’s output might not be entirely factual.

Schwartz practices law at Levidow, Levidow & Oberman. He turned to the chatbot for help with a lawsuit against Colombian airline Avianca, and his filing was riddled with citations to cases that are not real.

The case involved a man who claimed to have suffered an injury on a flight to New York City. After the airline asked the court to dismiss the lawsuit, Schwartz filed a 10-page brief arguing that the case should be allowed to continue.

The list of cases cited in the filing was impressive. It included Miller v. United Airlines, Martinez v. Delta Airlines, and Varghese v. China Southern Airlines.

Only, none of these courtroom dramas ever happened.

Schwartz filed an affidavit Thursday explaining that he used ChatGPT to “supplement” his research in composing the brief. He included screenshots showing that he had asked ChatGPT whether the cases were indeed real and that the chatbot had answered yes.

The chatbot stood by its answers, claiming it had drawn on “reputable legal databases” that included LexisNexis and Westlaw.

Schwartz sheepishly expressed regret for resorting to AI and promised to “never do so in the future without absolute verification of its authenticity.” Work is no doubt underway to rein in hallucinations, but for now the flaw remains a known limitation of these systems.

Whether Schwartz will get a chance to make amends is anyone’s guess at this point. The judge in the case has scheduled a June 8 hearing to consider possible sanctions against the attorney for his actions.

While many in widely divergent walks of life now claim to be putting AI to work for the greater good, those joining the mad rush might do well to slow down and question the so-called “progress.” The legal profession just received a vivid example of the current limitations of the shiniest new bauble to hit the market.