US | Incident Report
Hallucination
2025-12-15
Federal judge sanctions Hagens Berman over ChatGPT-hallucinated citations
AI Model: ChatGPT (OpenAI)
I. Executive Summary
Reuters reported that U.S. District Judge Fred Slaughter (Santa Ana, California) sanctioned plaintiffs’ counsel in a lawsuit against OnlyFans’ parent company for AI-related citation errors. The court imposed $13,000 in total sanctions on Hagens Berman, partner Robert Carey, and co-counsel Celeste Boyd after multiple briefs contained “hallucinated” authorities. The reporting indicated that Boyd used ChatGPT and did not verify its output before filing. The judge declined to accept corrected briefs as a cure and dismissed the case, while granting leave to file an amended complaint.
II. Key Facts
- Multiple briefs included AI-generated (“hallucinated”) legal citations and/or supporting authority.
- The court found the filings were not “warranted by existing law” and imposed monetary sanctions totaling $13,000.
- Reporting identified ChatGPT as the tool co-counsel used, with outputs not adequately verified before filing.
- The court rejected an attempt to amend/correct the filings as insufficient to undo the harm.
- The case was dismissed with leave to amend the complaint.
III. Regulatory & Ethical Implications
The ruling reinforces that courts treat unverified generative-AI output as a matter of professional responsibility and procedural compliance, not a mere “drafting mistake.” The sanctions signal heightened scrutiny of supervision, verification, and human-in-the-loop quality controls whenever AI assists with legal research or briefing, with monetary exposure for both firms and individual lawyers.