The US Tax Court is considering how to respond to the use of artificial intelligence in court filings after attorneys and litigants submitted briefs citing cases that do not exist, according to Judge Mark V. Holmes.
Holmes said the court is proceeding cautiously because more than three-quarters of its cases involve self-represented taxpayers rather than lawyers. He said that distinction affects how the court would approach any sanctions because pro se litigants are not subject to the same professional ethics rules that govern attorneys.
“The Tax Court wants to proceed gingerly,” Holmes said in an interview. “If a pro se person does it, it’s not really a violation of professional ethics because they’re not lawyers, and we would have to figure out some other way of determining what the appropriate sanction is.”
Holmes recently criticized an attorney for citing apparently AI-generated, nonexistent precedents in Clinco v. Commissioner. In that order, he quoted Chief Justice John Roberts’ warning to lawyers against filing briefs that cite nonexistent cases.
Holmes said he has seen fewer than a dozen instances of hallucinated case citations in Tax Court filings and has not yet imposed sanctions in those matters.
The issue has also arisen in other federal courts. The US Court of Appeals for the Tenth Circuit imposed a $1,000 fine on a Maryland attorney after she acknowledged using generative AI to prepare a brief that cited nonexistent cases. The court also referred the matter to disciplinary authorities.
In another case, Johnson v. Dunn, three attorneys were disqualified after submitting a motion that included fabricated case citations.
The issue has extended to questions of privilege and confidentiality. In United States v. Heppner, decided in February, the Southern District of New York ruled that documents created by a defendant using public AI tools were not protected by attorney-client privilege or work-product doctrine because the tools functioned as third parties without confidentiality obligations.
Holmes said the Tax Court has additional concerns because its cases often contain sensitive taxpayer information. He said court filings regularly include Social Security numbers, bank account numbers, and identifying information for minor children.
“I’d say in almost 100% of the cases, Social Security numbers will pop up, bank account numbers will pop up, the names and identifying information of minor children will pop up,” Holmes said.
The court is considering whether to address AI use through its disciplinary committee, which handles attorney ethics, or through its rules committee. Holmes said possible changes could include amendments to court rules, though no specific proposal has been adopted.
“The rules committee might, for instance, amend the rules to require a certification that AI has not been used,” Holmes said. “But I’m not sure that would make any sense since there are perfectly legitimate uses of AI. We’re feeling our way in the dark still about where to go on this.”
Current law already gives the court authority to impose some penalties. Internal Revenue Code Section 6673 allows the Tax Court to impose penalties of up to $25,000 on parties who bring frivolous or groundless arguments or who institute proceedings primarily for delay.
Gilbert Rothenberg, a former chief of the Appellate Section in the Justice Department’s Tax Division, said that provision could serve as one tool in some cases involving improper AI use. He cautioned, however, that a filing containing fabricated AI-generated citations may not always fit neatly into rules aimed at frivolous claims.
Rothenberg said the court may be more likely to allow self-represented taxpayers to correct defective filings than to impose immediate sanctions.
The issue has also touched the Internal Revenue Service. In Khoja v. Commissioner, Judge Jennifer Siegel ordered a hearing after an IRS motion included a citation to a nonexistent case. The judge questioned the IRS attorney about the source of the citation and about what safeguards the office had in place.
Katherine Jordan, tax controversy counsel at Miller & Chevalier, said the incident highlighted the need for meaningful human review of AI-generated material. She said the Tax Court is likely to see more AI use by both attorneys and self-represented litigants.
Holmes said the court is reviewing how other courts have responded while taking into account the particular nature of its docket. He said any response will need to address both fabricated citations and the handling of confidential taxpayer information.
The Tax Court has not announced a formal policy on AI-generated filings. Holmes said judges are continuing to discuss the issue as the court considers whether new rules or sanctions are needed.
Source: Tax Court Navigates AI Misuse Rules for Self-Represented Filers | Bloomberg Law