Anthropic has won a preliminary injunction preventing the US Department of Defense from labeling it a supply chain risk, potentially clearing the way for customers to start working with the company again. Thursday’s decision by Rita Lin, a federal district judge in San Francisco, is a setback for the Pentagon and a big boost for generative AI companies as they try to preserve their business and reputations.
“Defendants’ designation of Anthropic as a ‘supply chain risk’ is arguably unlawful and arbitrary and capricious,” Lin wrote in justifying the temporary relief. “The War Department provides no legitimate basis for inferring, from Anthropic’s insistence on restrictions on the use of its technology, that the company poses a risk.”
Anthropic and the Pentagon did not immediately respond to requests for comment on the decision.
The Department of Defense, which under Trump calls itself the War Department, has relied on Anthropic’s Claude AI tools for drafting sensitive documents and analyzing classified data over the past few years. But this month it began pulling the plug on Claude after deciding that Anthropic could not be trusted. Pentagon officials cited numerous cases in which Anthropic allegedly placed, or sought to place, restrictions on the use of its technology that the Trump administration found unacceptable.
Ultimately the administration issued several orders, including the supply chain risk designation, which had the effect of gradually halting Claude’s use across the federal government and harming Anthropic’s sales and public reputation. The company filed two lawsuits challenging the restrictions as unconstitutional. At Tuesday’s hearing, Lin said the government may have acted unlawfully in seeking to “criminalize” and “punish” Anthropic.
Lin’s decision on Thursday “restores the status quo” as of February 27, before the order was issued. “It does not preclude any defendant from taking any lawful action that would have been available to them” on that date, she wrote. “For example, this order does not require the Department of War to use Anthropic products or services, and it does not prevent the Department of War from turning to other providers of artificial intelligence services, as long as those actions are consistent with applicable constitutional provisions, rules, and regulations.”
The ruling suggests the Pentagon and other federal agencies remain free to cancel contracts with Anthropic and to ask contractors that include Claude in their own tools to stop doing so, so long as they do not cite the supply chain risk designation as the basis.
The immediate effect is unclear because Lin’s order will not take effect for a week. And a federal appeals court in Washington, DC, has yet to rule in a second case filed by Anthropic, which focuses on a separate law under which the company was also barred from providing software to the military.
But Anthropic can point to Lin’s decision when dealing with other customers and industry partners, signaling that the law may ultimately be on its side. Lin has not set a timetable for a final decision.