Anthropic has signed a memorandum of understanding with the Australian government that gives Canberra access to data about how Australians use Claude. Signed April 1 in Canberra by Anthropic CEO Dario Amodei and Senator Tim Ayres, the agreement formalizes a three-part deal, according to the Australian government's MOU filing: collaboration with Australia's AI Safety Institute, sharing of the company's Economic Index data, and A$3 million in API credits for four Australian research institutions. The MOU carries no legal weight; the filing states explicitly that it is not intended to confer any preferential treatment in procurement or regulatory decisions.
The Economic Index is Anthropic's proprietary metric for tracking AI adoption across economies. Under the agreement, Australia will receive this data to monitor how AI is being adopted in sectors including natural resources, agriculture, healthcare, and financial services, and how it is affecting workers. The framing from Anthropic's side is straightforward: Australia is a significant per-capita adopter of Claude, ranking seventh globally on that measure despite being only the eleventh-largest market by absolute traffic, according to Anthropic's own research on Australian usage. Australians use the model 4.1 times as much as their share of the working-age population would predict, and their use cases are unusually diverse, with personal and work-related conversations split nearly evenly at 47% and 46%.
What Australia is actually getting, though, is visibility into metrics that Anthropic built from Australians' own activity. The data belongs to Anthropic. The MOU does not transfer ownership or impose ongoing disclosure obligations. It is a snapshot and a statement of intent, not a regulatory requirement.
The agreement mirrors arrangements Anthropic has with the AI safety institutes of the United States, United Kingdom, and Japan. Like those deals, it includes joint safety evaluations and academic research collaboration. But Australia brings something the others do not: a government with no AI-specific legislation. Australia currently regulates AI under existing laws and voluntary guidelines, meaning there is no statutory framework requiring Anthropic to share model information, report incidents, or submit to external audit. The MOU fills none of that gap.
Also unresolved is the copyright deadlock that has blocked deals to use Australian data centers for model training. Any plan to conduct training work locally remains constrained by an ongoing negotiation between the Australian government and AI companies over licensing terms for training data, the Sydney Morning Herald reported. The MOU does not touch that problem.
Anthropic is also in a legal fight with the Trump administration after the Pentagon designated it a supply chain risk, cutting off federal contracts worth hundreds of millions of dollars. Anthropic sued to challenge the designation, and a judge has temporarily blocked it. The company has hired three lobbying firms in Canberra, according to the Sydney Morning Herald: Anacta Strategies, SEC Newgate, and Policy Australia.
Alongside the safety agreement, Anthropic announced a deep tech startup API program offering up to US$50,000 in credits to VC-backed companies working in drug discovery, materials science, climate modeling, and medical diagnostics. The A$3 million in API credits went to the Australian National University, the Murdoch Children's Research Institute, the Garvan Institute of Medical Research, and Curtin University. Anthropic also said it is exploring data center infrastructure investments in Australia aligned with the government's recently announced data center policy, and that it plans to open a Sydney office.
The company closed a US$30 billion Series G funding round in February, valuing it at US$380 billion. What Australia gets is a cooperative relationship with no governance rights attached. The MOU signals willingness to share data. It does not require Anthropic to keep sharing it.