
March 12, 2026

Microsoft and 22 retired generals back Anthropic's court fight against Pentagon

Sources: Wikipedia · Federal News Network · Colorado Springs Gazette · Sherwood News · Stars and Stripes

22 retired generals say Hegseth abused national security law to punish an AI company

On March 11, 2026, Microsoft filed an amicus curiae brief in San Francisco federal court asking Judge Rita Lin to block the Pentagon's supply chain risk designation against Anthropic. Microsoft told the court that the designation forces government contractors to comply with vague, ill-defined directions that have never before been publicly wielded against a U.S. company, and that using a supply chain risk designation to resolve a contract dispute could carry severe economic effects contrary to the public interest. The filing asked the court to issue a temporary restraining order blocking the designation to allow for more reasoned discussion. Microsoft also formally endorsed Anthropic's two ethical red lines: that AI should not be used for domestic mass surveillance, and that humans should remain in the loop for high-stakes automated decisions.

A separate amicus brief was filed the same day by a group of 22 retired senior U.S. military officials, including former CIA Director Michael Hayden and retired Coast Guard Admiral Thad Allen, who led the federal government's response to Hurricane Katrina. Their brief explicitly set aside the substance of the Anthropic-Pentagon dispute and focused on what the group characterized as a fundamental abuse of government authority: "Something more basic is at stake: the misuse of powerful national security authorities by civilian political leadership, not to address the serious concerns that led Congress to delegate the authority in question, but as retribution against a private company that has displeased the leadership. Far from protecting U.S. national security, the Secretary's conduct here threatens the rule-of-law principles that have long strengthened our military."

Earlier supporting briefs had been filed by AI developers at Google and OpenAI. A fourth amicus brief came from the Cato Institute and the Electronic Frontier Foundation. The spectrum of organizations — ranging from the Cato Institute on the right to EFF on the left, and from Microsoft as a corporate interest to retired military leaders as national security veterans — illustrated the breadth of opposition to the Pentagon's legal theory. The case is before Judge Rita Lin in San Francisco, with a hearing scheduled for March 24, 2026.

The backstory of the dispute runs through Anthropic's $200 million Pentagon contract negotiated under the Biden administration. That contract allowed Claude to be deployed across classified military networks but included two restrictions: Claude could not be used for mass surveillance of American citizens, and it could not be used to control fully autonomous weapons without meaningful human oversight. The rupture began in late January 2026, when Anthropic started pressing the Pentagon for details about how Claude had been used during the U.S. military operation that led to the capture of Venezuelan President Nicolás Maduro. The DOD insisted that Anthropic must allow all lawful uses without restriction; Anthropic CEO Dario Amodei said the company could not in good conscience accept that condition.

The retired military officers' brief carries particular weight because it directly contests the Pentagon's framing of the Anthropic dispute as a national security matter. The officers argue the opposite: that abusing national security authorities to coerce a private company's product policies is itself a threat to the rule of law the military exists to defend. The list of signatories includes Admiral C. Steve Abbot, Admiral Thad W. Allen, and Vice Admiral Donald C. Arthur, among 19 others.

Microsoft's brief also raised an operational concern: removing Anthropic's Claude from military systems while Operation Epic Fury is underway could hamper U.S. warfighters at a critical moment. The argument exposes a paradox in the Pentagon's position: it is simultaneously using Anthropic's AI in active combat operations against Iran and designating Anthropic as a supply chain risk.

The case before Judge Lin represents a significant test of how far executive branch national security authority can extend into the commercial regulation of private AI companies. Legal analysts across the political spectrum have called the legal theory weak, because Congress delegated supply chain authority to counter adversarial national security threats, not to settle contract disagreements. OpenAI CEO Sam Altman issued a memo to staff saying OpenAI shares Anthropic's red lines on mass surveillance and autonomous weapons, even as the company negotiates its own guardrail terms with the Pentagon.

🤖 AI Governance · 📜 Constitutional Law · 🛡️ National Security · 🏛️ Government

People, bills, and sources

Michael Hayden

Former CIA Director; Retired U.S. Air Force General

Thad Allen

Retired Admiral, U.S. Coast Guard; Principal Federal Official for Hurricane Katrina Response; 23rd Commandant of the U.S. Coast Guard

Rita Lin

U.S. District Judge, Northern District of California (San Francisco)

Dario Amodei

CEO, Anthropic

Pete Hegseth

U.S. Secretary of Defense

Sam Altman

CEO, OpenAI

What you can do

1

civic education

Track the Anthropic v. Pentagon court case through PACER federal court records

Microsoft and 22 retired military leaders, including former secretaries of the Air Force, Army, and Navy and a former Coast Guard commandant, have filed amicus briefs supporting Anthropic's court fight against the Pentagon. Defense Secretary Pete Hegseth designated Anthropic as a supply chain risk, blocking military contracts, after the company refused to allow its AI to be used for mass surveillance and fully autonomous weapons. The retired officers allege this is retribution against a company that displeased leadership. The case before Judge Rita Lin in San Francisco has a March 24, 2026 hearing on a temporary restraining order.

Go to pacer.uscourts.gov and search for case filings in the Northern District of California, San Francisco Division. Look for Anthropic v. Pentagon or a similar case caption. The March 24 hearing will address Anthropic's request for a temporary restraining order blocking the supply chain risk designation. Microsoft filed an amicus brief challenging Defense Secretary Hegseth's action, and 22 retired military leaders, including former service secretaries, allege the designation is retribution against Anthropic for refusing to allow its AI to be used for mass surveillance and fully autonomous weapons. Court filings are publicly available and document the conflict between corporate ethics and government pressure.

2

advocacy

Contact your senators to support legislation defining limits on Pentagon supply chain risk designations against domestic companies

The supply chain risk management authority was designed to exclude adversarial foreign vendors. No current law explicitly prevents its use against domestic companies in commercial disputes.

Hello, my name is [name] and I'm a constituent from [city]. Does [Senator's name] believe the Pentagon should be able to blacklist a U.S. company as a national security risk for refusing to remove AI safety guardrails? Does [Senator's name] support legislation limiting supply chain risk authority to adversarial foreign vendors, as Congress originally intended?