
Why Does Everybody Want More AI Chips?


For a new and revolutionary Generative Artificial Intelligence (GenAI) app to function, three things have to be in place. First, you'll need a chip, like Nvidia's famed GPU (graphics processing unit), that can train and run an AI model. Next, you need the AI model itself, the foundation on which the app will stand; Amazon's Bedrock, for instance, is a service that provides access to such foundation models. Finally, you need the app that is presented to the consumer, for instance OpenAI's ChatGPT.

On that first level, the chip, Nvidia is indisputably the most powerful presence in the market, commanding about 80% of the share. Makers of AI software like ChatGPT and Google's Bard rely on large AI models, and it is Nvidia's chips that train them. The training process can go on for months and use thousands of GPUs. Once trained, the model is put to work generating content through a process called inference.
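To make the distinction concrete, here is a minimal, purely illustrative PyTorch sketch of the two GPU workloads described above. The tiny model, random data, and hyperparameters are invented stand-ins and have nothing to do with how a production LLM is built; the point is simply that training repeatedly updates weights, while inference only runs forward passes on frozen weights.

```python
# Illustrative sketch: training (forward + backward passes that update weights)
# versus inference (forward passes only, weights frozen). Toy model and fake data.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # chips like the H100 handle the heavy lifting

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training phase: a real LLM runs many such steps, for months, across thousands of GPUs.
for step in range(100):
    x = torch.randn(32, 64, device=device)          # fake input batch
    y = torch.randint(0, 10, (32,), device=device)  # fake labels
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()   # gradients flow backwards through the model
    optimizer.step()  # weights are updated

# Inference phase: the trained weights are frozen; only a forward pass runs.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 64, device=device)).argmax(dim=-1)
    print("predicted class:", prediction.item())
```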

The new GH200 chip unveiled by Nvidia in August is special because it pairs the same class of GPU as the existing H100 with 141 gigabytes of memory. The expanded memory capacity suits it for the task of inference, and as a result of the new chip, "The inference cost of large language models will drop significantly", in the view of Nvidia CEO Jensen Huang.
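Why memory matters for inference can be shown with a rough back-of-the-envelope calculation. The model size and precision below are assumptions chosen for illustration, not Nvidia figures, and the counts ignore the extra memory a real deployment needs for activations and the KV cache.

```python
# Back-of-the-envelope sketch: how much memory the weights of a large model occupy,
# and how many chips are needed just to hold them. Figures are illustrative assumptions.
PARAMS = 70e9        # e.g. a 70-billion-parameter model
BYTES_PER_PARAM = 2  # 16-bit (fp16/bf16) weights

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~140 GB

# More memory per chip means fewer chips just to hold the weights
# (real serving also needs headroom for activations and the KV cache).
for memory_gb in (80, 141):
    chips_needed = -(-weights_gb // memory_gb)  # ceiling division
    print(f"{memory_gb} GB per chip -> at least {int(chips_needed)} chip(s) for the weights")
```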

Nvidia might be the biggest mover and shaker in the AI chip market at the moment, but it is being challenged for a slice of that precious market share by some well-known tech names. Join us for the story, especially if you like CFD stock trading in Big Tech on the iFOREX platform.

AMD

Back in May, it was reported that Microsoft Corp. was offering engineering resources to AMD (Advanced Micro Devices) in support of its expansion into the AI chip business. The following month, AMD announced that its own in-house GPU, the MI300X, would be ready for limited shipping by the end of the year. Until now, AMD has been the go-to name for regular computer processors, but company CEO Lisa Su is certain that AI will be AMD's "largest and most strategic long-term growth opportunity". One thing that could result from the new competition with Nvidia is a fall in prices from sums like the roughly $30,000 it currently costs to buy Nvidia's H100.

Alphabet

In Google's DeepMind unit, the interest lies not just in producing AI chips but in using AI to design them more efficiently. The way chips have traditionally grown more advanced is through the addition of more transistors, but this approach has its limits. The way ahead appears to be smaller chips geared to performing specific tasks; self-driving cars, drones, and ChatGPT all rely on this kind of chip to function. These are what DeepMind wants to churn out, with the help of AI. It could be a revolutionary development for the industry because a human team typically takes several weeks to produce a single chip design, whereas DeepMind could potentially produce thousands of designs in a single week.

IBM

IBM want to make AI chips with more economical power use, which means that "large and more complex workloads could be executed in low power or battery-constrained environments" like mobile phones, cars, and cameras, says IBM scientist Thanos Vasilopoulos. Instead of the normal method of storing information strictly as 1s and 0s, IBM's new 'memristor' chip can hold a range of values in between. When memristors are wired together into a network, the result resembles a human brain in some ways: specifically, a memristor can recall its own electronic history in the way a human synapse can. Specialists in the field note, however, that making a computer out of memristors could run into problems on the manufacturing and cost fronts.
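The brain-like trick here is analog in-memory computing: weights live in the memristor array itself, and the array performs a matrix-vector multiply in place rather than shuttling numbers to and from a separate processor. The NumPy sketch below is only a conceptual illustration of that idea, with invented values and noise standing in for device imperfection; it is not a model of IBM's actual chip.

```python
# Conceptual sketch of analog in-memory computing with a memristor crossbar.
# Weights are stored as conductances; Ohm's law (current = conductance * voltage)
# plus summing currents down each column gives a matrix-vector product "for free".
import numpy as np

rng = np.random.default_rng(0)

weights = rng.uniform(-1.0, 1.0, size=(4, 3))  # values a digital chip would store as binary numbers
inputs = rng.uniform(0.0, 1.0, size=4)         # applied as voltages on the rows

# Each weight is programmed as an analog conductance level (a continuum, not just 0 or 1).
# Gaussian noise stands in for device imperfection; real chips pair devices to encode signs.
conductances = weights + rng.normal(0.0, 0.01, size=weights.shape)

analog_result = inputs @ conductances  # currents accumulated along each column
exact_result = inputs @ weights        # the ideal digital multiply, for comparison

print("analog  :", np.round(analog_result, 3))
print("digital :", np.round(exact_result, 3))
```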

The Landscape Ahead

The wave of enthusiasm about GenAI could bring a recovery to Amazon's most significant segment, Amazon Web Services (AWS), which has seen revenue growth shrink for the last seven quarters. AWS's cloud services are needed for GenAI, and Amazon have also come out with their own AI chips, named Trainium and Inferentia. In addition, the firm has a hand in building AI applications, for instance their CodeWhisperer coding assistant.

The AI chip market looks set to be an arena for stiff competition between the likes of Nvidia, AMD, Google, and Microsoft. Watch out for this if you're going to be CFD stock trading with these big names on the iFOREX platform. Using iFOREX's celebrated mobile trading app, you can trade the price movements (whether up or down) not only of tech company shares, but also of hundreds of other instruments, including commodities, cryptocurrencies, forex pairs, and more.

ThePrint ValueAd Initiative content is a paid-for, sponsored article. Journalists of ThePrint are not involved in reporting or writing it.

