‘Dark horse’ Google is now giving Nvidia a run for its money in the AI race

“We’ve taken a full, deep, full-stack approach to AI,” Sundar Pichai, chief executive officer for Google and Alphabet, told investors last quarter. “And that really plays out.”
Any concerns that Google might be held back by regulators are dying away. The company recently avoided the most severe outcome from a US anti-monopoly case — a breakup of its business — in part because of the perceived threat from AI newcomers. And the search giant has shown some progress in the longtime effort to diversify beyond its core business. Waymo, Alphabet’s driverless car unit, is coming to several new cities and just added freeway driving to its taxi service, a feat made possible by the company’s enormous research and investment.
Some of Google’s edge comes from its economics. It’s one of the few companies that produces what the industry calls the full stack in computing. Google makes the AI apps people use, like its popular Nano Banana image generator, as well as the software models, the cloud computing architecture and the chips underneath. The company also has a data goldmine for constructing AI models from its search index, Android phones and YouTube — data that Google often keeps for itself. That means, in theory, Google has more control over the technical direction of AI products and doesn’t necessarily have to pay suppliers, unlike OpenAI.
Several tech companies, including Microsoft and OpenAI, have plotted ways to develop their own semiconductors or forge ties that make them less reliant on Nvidia’s bestsellers.
For years, Google was effectively its own sole customer for its homegrown processors, called tensor processing units, or TPUs, which the company designed more than a decade ago to manage complex AI tasks. That's changing. AI startup Anthropic said in October it would use as many as 1 million Google TPUs in a deal worth tens of billions of dollars.
On Monday, tech publication The Information reported that Meta planned to use Google’s chips in its data centres in 2027. Google declined to address the specific plans, but said that its cloud business is “accelerating demand” for both its custom TPUs and Nvidia’s graphics processing units. “We are committed to supporting both, as we have for years,” a spokesperson wrote in a statement.
Meta declined to comment on the report on Monday night.
“We’re delighted by Google’s success,” a spokesperson for Nvidia said in a statement on Tuesday. “They’ve made great advances in AI, and we continue to supply to Google.” The spokesperson added: “Nvidia is a generation ahead of the industry – it’s the only platform that runs every AI model and does it everywhere computing is done.”
Analysts read the Meta news as a signal of Google’s success. “Many others have failed in their quest to build custom chips, but Google can clearly add another string to its bow here,” Ben Barringer, head of technology research for Quilter Cheviot, wrote in an email.
Gemini 3 Pro has risen to the top of closely watched AI leaderboards on LMArena and Humanity’s Last Exam. Andrej Karpathy, a founding member of OpenAI, said it’s “clearly a tier 1 LLM,” referring to large language models. Google pitched the model as one that can solve complex science and math problems, and address nagging issues — such as generating images and overlaid text with incorrect spelling — that might deter enterprise customers from adopting AI services more widely.
Consumer interest is harder to gauge. Google said last week that 650 million people use its Gemini app. OpenAI recently said ChatGPT hit 800 million weekly users. As of October, Gemini’s app had 73 million monthly downloads, well shy of ChatGPT’s 93 million monthly downloads, according to research firm Sensor Tower.
Google is an advertising behemoth, but it has historically struggled to find other commercial models. Its cloud business reported third-quarter revenue of $US15.2 billion, up 34 per cent from the prior year. Still, that leaves it in third place, behind Microsoft and Amazon Web Services, which each posted more than double Google's cloud sales in the most recent quarter. Counterpoint Research's Shah said Google's AI adoption among enterprises lags Microsoft and Anthropic.
Meanwhile, OpenAI is targeting profits by selling a premium version of ChatGPT and adjacent software to companies. It’s cutting deals with chipmakers from Broadcom to Advanced Micro Devices to Nvidia to support its AI ambitions.
Google’s TPUs are mostly attractive to a handful of companies with big computing bills, like Meta and Anthropic, said Meryem Arik, CEO of the AI startup Doubleword.
And the chip industry is “not a zero-sum game with just one winner,” said Barringer.
For one, AI developers can only access Google’s chips through the company’s own cloud service. They can use Nvidia’s graphics processing units, or GPUs, more flexibly. “As soon as you use TPUs, you’re locked into” the Google cloud ecosystem, said Arik.
Being tied to a single supplier was once something companies went out of their way to avoid. Thanks to Google's advances in AI, that's no longer a dealbreaker.
“It’s definitely fair to say that Google is back in the game with Gemini 3,” said Thomas Husson, analyst at Forrester. “In fact, to paraphrase a quote attributed to Mark Twain, reports of Google’s death have been widely exaggerated, not to say irrelevant.”
Bloomberg