A lawyer in California was sanctioned by the court after it emerged that his briefs cited non-existent cases fabricated entirely by artificial intelligence (AI) software.
The Daily Journal reported:
By now, most attorneys are aware of AI’s limitations, especially its propensity to generate fictitious citations and unreliable sources. The California Court of Appeal has now clarified that these “hallucinations” can have serious repercussions.
In the case of Noland v. Land of the Free, L.P. (B331918, Sept. 12, 2025), Division Three of the Second District fined the plaintiff/appellant’s lawyer $10,000 and reported him to the state bar. The court found that most of the legal references in his appellate briefs were concocted by generative AI tools and had never been mentioned in any real case.
The Court made its intentions clear in publishing the opinion: “We … publish this opinion as a warning. To put it simply, no brief, pleading, motion, or any court document should include any citations—whether generated by AI or any other source—that the responsible attorney has not personally read and confirmed.” (emphasis in original).
AI not only creates fake cases; it can also invent fictitious events. ChatGPT recently produced a false claim that legal scholar and Fox News commentator Jonathan Turley had been accused of sexually harassing a student — neither the harassment nor any such accusation ever occurred.