In a first for Missouri courts, an appeals court has hit a litigant with a hefty $10,000 sanction for using artificial intelligence to generate nearly two dozen completely fake citations in a court filing. It's a cautionary tale about the misuse of AI that could prompt new court rules.
The case involved Jonathan Karlen, an O'Fallon man appealing a $311,000 judgment he had been ordered to pay a former employee, Kruse, for unpaid wages and attorney fees. Karlen's appeal brief cited 24 legal cases and authorities, but 22 of them were entirely made up: "AI hallucinations," as the court put it.
The fabrications began to unravel when Kruse's attorneys realized something was off. As one lawyer said, "He was making claims in his argument that just didn't comport with the law." Red flags went up that the cited cases couldn't possibly say what Karlen claimed they did.
It turns out Karlen had hired a California legal consultant who promised low-cost help, not realizing the person had simply used AI to generate the citations and content of the brief. Karlen later confessed and apologized, saying it "was absolutely not my intention to mislead," but the appellate court wasn't having it.
In a scathing rebuke, the court called Karlen's submission of "bogus citations" for any reason a "flagrant violation" of his duty of candor that simply could not be tolerated. Legal experts say it is the first sanctions order of its kind in Missouri related to fictitious AI-generated legal citations, an issue that has also surfaced in other states.
According to one former judge, courts may now need to “enact rules that require lawyers and pro se litigants to verify all citations and to identify any portions of their submissions that are generated by artificial intelligence.”
The lesson: Misrepresenting authorities to a court is a severe ethical breach. While AI presents new challenges, the core principles of accuracy and candor still apply. Lawyers and litigants would be wise to tread carefully with generative AI tools in legal work for now, lest they find themselves in Karlen's shoes: stuck with a hefty sanctions bill rather than a successful appeal.
What do you think about this case and its implications? Have you seen other examples of generative AI being misused in legal proceedings or other professional contexts? I’d love to hear your thoughts in the comments.
Be Audit-Secure!
Lisa Smith, SPHR, SCP
Note: This blog post is for informational purposes only and should not be construed as legal advice. Always consult with a legal professional for advice specific to your situation.