Anthropic said it will expand its use of Google Cloud to access up to one million Tensor Processing Units (TPUs) and “well over a gigawatt” of computing capacity coming online in 2026 to train future Claude models.

The multiyear expansion is valued in the tens of billions of dollars. Anthropic cited the price-performance and efficiency of TPUs, noting that it already trains and serves Claude on Google's in-house chips through Google Cloud.

The scale underscores the industry's race for compute. OpenAI has recently signed multiple deals, potentially costing over $1 trillion, to secure about 26 gigawatts of capacity, roughly the power used by 20 million U.S. homes. Industry estimates peg 1 GW of AI compute at roughly $50 billion.

Beyond chips, Google will provide additional cloud services under the agreement. Separately, Reuters reported this month that Anthropic expects its annualized revenue to more than double, and potentially nearly triple, next year. Coding tools such as Cursor also support Anthropic's Claude models for enterprise development workflows.