
US Judge Questions Pentagon Blacklisting of Anthropic in AI Dispute

March 25, 2026

A U.S. judge on Tuesday (March 24) raised concerns about the Pentagon’s decision to blacklist artificial intelligence company Anthropic, suggesting the move may have been intended to penalize the firm for publicly voicing objections about the military use of AI technology.


According to Reuters, U.S. District Judge Rita Lin, presiding in San Francisco, indicated during a hearing that the designation of Anthropic as a national security supply-chain risk appeared unusually punitive. The label, typically used to flag companies that could expose military systems to sabotage or infiltration, has rarely been applied to a U.S.-based firm.

The dispute stems from a lawsuit filed by Anthropic in California federal court, in which the company argues that Defense Secretary Pete Hegseth exceeded his authority by imposing the designation. Per Reuters, the classification followed Anthropic’s refusal to allow its Claude AI system to be used in military surveillance or autonomous weapons programs.

Judge Lin commented that the decision “looks like an attempt to cripple Anthropic,” according to Reuters. She also said, “It looks like DoW is punishing Anthropic for trying to bring public scrutiny to this contract dispute,” referencing President Donald Trump’s renaming of the Defense Department as the Department of War.

The hearing focused on Anthropic’s request for a temporary order to block the designation while the case proceeds. Lin said she would issue a written ruling in the coming days.

Anthropic has argued that its AI models are not sufficiently reliable for deployment in autonomous weapons and that domestic surveillance applications raise serious civil liberties concerns. The company claims the designation could cost it billions of dollars in lost contracts and damage its reputation.

Read more: Trump Administration Defends Pentagon Blacklisting of AI Firm Anthropic in Court Filing

According to Reuters, the supply-chain risk label used against Anthropic marks the first known instance of such a designation being publicly applied to a U.S. company under a procurement law intended to protect military systems from foreign threats.

In its lawsuit, Anthropic alleges that the government retaliated against it for expressing views on AI safety, violating its First Amendment rights. The company also contends that it was denied the opportunity to challenge the designation, which it says infringes its Fifth Amendment right to due process.

During the hearing, Anthropic’s attorney Michael Mongan argued that the Pentagon misapplied procurement law in response to a contract dispute. “The logical implication of their position here is they can point to their frustrations in a contract negotiation, the stubbornness of the vendor, and say, ‘because you’re working in an area that touches national security, we’re going to tell the world that we think you might come around in the future and sabotage our systems,’” he said.

Government lawyers defended the move, arguing that Anthropic’s resistance to certain uses of its technology raised concerns about reliability in critical military operations. Justice Department attorney Eric Hamilton said, “What happens if Anthropic, through an update, installs a kill switch or installs functionality that allows it to change how the software is functioning when our warfighters need it most? That is an unacceptable risk.”

Per Reuters, Anthropic is also pursuing a separate case in Washington, D.C., challenging another Pentagon designation that could bar it from civilian government contracts.

Source: Reuters