
Chip wars: How Google’s TPUs are giving Nvidia a run for its money

Both types of chip can handle the large number of computations involved in training AI models, but they achieve this in different ways.


For as long as Nvidia Corp. has dominated the market for artificial intelligence chips, customers have made clear they’d like to see more competition. It turns out one of the most formidable alternatives may have been hiding in plain sight.

Google released its tensor processing units a decade ago to help speed up the company’s web search engine and boost efficiency. They were later adapted for machine learning tasks in Google’s AI applications.

Now the company is securing major deals for TPUs, suggesting they may be a credible alternative to Nvidia’s AI accelerators, known as graphics processing units (GPUs), for training and operating today’s complex large language models.

Here’s more on TPUs, how they work, their promise and their limitations.

What’s the difference between a GPU and a TPU?

Both types of chip can handle the large number of computations involved in training AI models, but they achieve this in different ways. Nvidia’s GPUs were originally developed to render video game images in a lifelike way, processing multiple tasks in parallel via thousands of computing “cores.” This architecture also allows them to carry out AI tasks at speeds that can’t be matched by rival technologies.

TPUs were built specifically for a type of AI-related work known as matrix multiplication, which is the main operation involved in training the neural networks that generate responses to prompts in AI chatbots such as OpenAI’s ChatGPT and Anthropic PBC’s Claude. Much of that work boils down to the same multiply-and-add operations repeated over and over, and TPUs were designed to churn through them efficiently. They are seen as less adaptable and more specialized than Nvidia GPUs, but also less power-hungry when running those operations. Nvidia GPUs are viewed as more adaptable and programmable, but this flexibility can make them costlier to operate.
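To make that concrete, here is a minimal sketch in JAX, the Google numerical library that runs the same code on CPUs, GPUs and TPUs. It expresses a single dense neural-network layer as a matrix multiplication, the operation TPU hardware is built to accelerate; the array shapes and variable names are illustrative rather than drawn from any real model.

```python
# A dense neural-network layer is essentially a matrix multiplication plus a
# bias and a nonlinearity: the workload TPU hardware is designed to speed up.
# Shapes and names here are illustrative only.
import jax
import jax.numpy as jnp

def dense_layer(x, weights, bias):
    # x: (batch, in_features); weights: (in_features, out_features)
    # jnp.dot is lowered by the XLA compiler to the accelerator's
    # matrix-multiply units (the MXU on a TPU, tensor cores on a modern GPU).
    return jax.nn.relu(jnp.dot(x, weights) + bias)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 512))            # a batch of 32 input vectors
w = jax.random.normal(key, (512, 1024)) * 0.02   # layer weights
b = jnp.zeros(1024)                              # layer bias

# jax.jit compiles the function for whatever backend is attached:
# CPU by default, or a GPU/TPU if one is available.
layer = jax.jit(dense_layer)
print(layer(x, w, b).shape)  # (32, 1024)
```

In JAX, the same function runs unchanged on a CPU, GPU or TPU backend, which is part of why the choice of accelerator can be largely invisible to the model code itself.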

How did TPUs emerge as an AI contender?

Google began working on its first TPU in 2013 and released it two years later. Initially, it was used to speed up the company’s web search engine and boost efficiency. Google first began putting TPUs in its cloud platform in 2018, allowing customers to sign up for computing services running on the same technology that had boosted the search engine.

The chips were also adapted to support Google’s in-house development of AI. As the company and its DeepMind unit built cutting-edge AI models like Gemini, lessons from those AI teams flowed back to the TPU designers, who in turn tailored the chips to the needs of the in-house model builders.

The latest version of Google’s TPU, called Ironwood, was unveiled in April. It’s liquid-cooled and designed for running AI inference workloads — meaning using the AI models rather than training them. It’s available in two configurations: a pod of 256 chips or an even larger one with 9,216 chips.
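For a rough sense of how software sees a pod like this, here is a minimal, hedged sketch, again in JAX, which is commonly used to program TPUs. It lists whatever accelerator devices the runtime exposes and splits a batch of data across them; the batch size, axis name and shapes are illustrative and not specific to Ironwood.

```python
# Minimal sketch: distribute a batch of data across whatever accelerator chips
# the JAX runtime can see. On a Cloud TPU slice, jax.devices() returns one
# entry per chip (or per core, depending on the TPU generation); on a laptop
# it falls back to a single CPU device, so the example still runs.
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = jax.devices()
print(f"{len(devices)} device(s) visible")

# Lay the devices out along one named axis and shard the batch dimension
# of the input across it. Names and shapes are illustrative only.
mesh = Mesh(np.array(devices), axis_names=("data",))
sharding = NamedSharding(mesh, P("data"))

x = jnp.ones((len(devices) * 8, 512))   # batch size divisible by device count
x = jax.device_put(x, sharding)         # split the batch across the chips

@jax.jit
def forward(x):
    # Each device computes its slice of the batch; XLA handles the rest.
    return jnp.sum(x ** 2, axis=-1)

print(forward(x).shape)                 # (batch,)
```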

TPUs can perform better than GPUs for some AI work because Google can “strip out a lot of other parts of the chip” that aren’t tailored to AI, said Seaport analyst Jay Goldberg, who has a rare sell rating on Nvidia shares. Over seven generations of the product, Google has improved the chips’ performance, made them more powerful and lowered the energy they require, which makes them less expensive to run.

Who wants TPUs?

Current TPU customers include Safe Superintelligence, the startup founded last year by OpenAI co-founder Ilya Sutskever, as well as Salesforce Inc., Midjourney and Anthropic.

Under a deal unveiled in October, Anthropic would gain access to more than a gigawatt of Google computing power via as many as 1 million TPUs. The following month, The Information reported that Meta Platforms Inc. was in discussions to use Google TPUs in its data centers in 2027.

The developments underscore how major AI names are embracing TPUs as they race to add computing power to cope with runaway demand.

What are the prospects for more TPU sales? 

The biggest AI developers are spending tens of billions of dollars on expensive Nvidia chips, and they’re anxious to temper that dependence and to mitigate the impact of shortages — pointing to a big potential market for TPUs.

For now, businesses that want to use Google TPUs have to sign up to rent computing power in Google’s cloud. That may soon change. The Anthropic deal makes an expansion into other clouds more likely, said Bloomberg Intelligence analysts.

No one, including Google, is currently looking to replace Nvidia GPUs entirely; the pace of AI development means that isn’t possible right now. Google is still one of Nvidia’s biggest customers despite having its own chips, because it has to maintain flexibility for its cloud customers, said Gaurav Gupta, an analyst at Gartner. If a customer’s algorithm or model changes, GPUs are better suited to handle a wider range of workloads. “Nvidia is a generation ahead of the industry,” according to an Nvidia spokesperson. “We’re delighted by Google’s success — they’ve made great advances in AI, and we continue to supply to Google.”

Even the tech companies that are signing up for TPUs are still committing heavily to Nvidia chips. Anthropic, for example, announced a big deal with Nvidia weeks after the Google TPU tie-up. The best hope for Google’s TPUs may be that they end up as part of the basket of products required to power the growth of AI.

(Reporting by Dina Bass)

Disclaimer: This report is auto generated from the Bloomberg news service. ThePrint holds no responsibility for its content.

