
The most severe professional sanction yet handed down for AI misuse in a legal proceeding landed on April 15, 2026, when the Nebraska Supreme Court suspended Omaha attorney Greg Lake indefinitely after a court brief he submitted contained 57 defective citations out of 63, including 20 fully fabricated case references that do not exist in any jurisdiction.
It is the most severe professional sanction for AI hallucinations yet imposed in the United States. Twenty of the defective citations were what courts now call hallucinations: realistic-seeming but entirely fabricated references, generated by an AI model that guessed plausibly at what was being asked for and produced convincing-looking but nonexistent authority.
The brief had been filed in a divorce appeal. Lake initially told justices at oral argument that he had uploaded the wrong version of a brief while traveling on his wedding anniversary with a broken computer. He later admitted to using AI.
The Problem Goes Well Beyond One Case
This is not an isolated incident. Researcher Damien Charlotin at HEC Paris, who maintains a database of AI hallucination cases in legal proceedings, now tracks more than 1,200 such cases globally, with approximately 800 from US courts. He has described the pace as reaching "ten cases from ten different courts on a single day."
A federal court in Oregon set a record last month, ordering a lawyer to pay $109,700 in sanctions and costs for filing AI-generated errors. The Sixth Circuit imposed a $30,000 fine on two Tennessee attorneys, the largest federal appellate sanction yet linked to fabricated citations. (NPR)
Why This Keeps Happening
The pattern is consistent across cases: an attorney uses AI to draft or research a brief, the AI generates plausible-sounding but nonexistent case citations, the attorney fails to verify them, and the brief reaches a court. The Nebraska court noted that the mistakes "could have been easily discovered using traditional legal research services." (NPR)
The problem is not that AI is being used. The problem is that it is being used as a shortcut that eliminates the verification step.
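To make the missing verification step concrete, here is a minimal sketch of an automated first pass: extract citation-shaped strings from a draft and flag any that a trusted source cannot confirm. Everything here is illustrative; the regex is a rough approximation of US reporter citations, and `verified_citations` stands in for lookups against a real legal research service. A flagged citation still needs a human check, and an unflagged one is not proof of accuracy.

```python
import re

# Rough pattern for US case citations like "410 U.S. 113" or "123 F.3d 456".
# Illustrative only; real citation formats are far more varied than this.
CITATION_RE = re.compile(r"\b\d{1,4}\s+[A-Z][\w.]*\s+\d{1,4}\b")

def flag_unverified(brief_text, verified_citations):
    """Return citations in the brief absent from a trusted source.

    `verified_citations` is a stand-in for results from a real legal
    research service (Westlaw, LexisNexis, etc.). This narrows the
    human review; it does not replace it.
    """
    found = set(CITATION_RE.findall(brief_text))
    return sorted(found - set(verified_citations))

brief = "See Smith v. Jones, 410 U.S. 113 (1973); accord 999 F.4th 321."
known = {"410 U.S. 113"}          # hypothetical verified-lookup results
print(flag_unverified(brief, known))  # flags the unconfirmed citation
```

A pass like this cannot tell a hallucinated case from one the lookup missed, which is exactly why the final responsibility stays with a human reader.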
What This Means for Your Business
The Nebraska case is the clearest signal yet that AI hallucination liability is moving from embarrassment to career-ending consequences. For any organization using AI in document preparation, contract drafting, research, or compliance work, the verification step is non-negotiable. Pairing AI drafting tools with final-review and fact-checking layers such as Grammarly can help catch errors before they cause damage. The standard of care is not "did AI write it" but "was it verified by a human who takes responsibility for accuracy." Courts, regulators, and boards are all moving toward that standard simultaneously.



