Data Center Operators Reject Google's TPU Chip Despite Nvidia Dominance Challenge
Google's push to sell its 8th-generation Tensor Processing Unit (TPU) chips to external data centers has stalled. Leading AI infrastructure providers, including Nebius, Lambda, and CoreWeave, said they have no immediate plans to adopt the chips, according to reporting from The Information.
Google developed TPUs for internal use with its Gemini AI models. After Gemini 3 outperformed competing systems, the company began offering TPUs to external customers. The market response has been lukewarm.
Nvidia's Lock-In Effect
Nvidia GPUs remain the default choice for data center operators building AI infrastructure. Marc Boroditsky, Nebius' chief revenue officer, said 99% of his company's customer demand is for Nvidia GPUs. Nick Robbins, CoreWeave's vice president of corporate development, put it plainly: "If 99% of the market wants GPUs, even if that demand drops to 90%, we'll still focus on GPUs."
Lambda's chief financial officer, Chuck Fisher, emphasized his company's commitment to Nvidia by saying, "We have green blood at Lambda," a reference to Nvidia's corporate color.
Nvidia is both a key supplier to and a major investor in many of these data center operators, creating financial and operational ties that discourage switching.
Google's Financing Workaround
Google is attempting to overcome resistance through financing arrangements. In February, the company agreed with a major investment firm to establish a joint venture leasing TPUs to customers. Google is also discussing options with financial partners to create a special purpose vehicle that would purchase and lease TPUs directly.
These moves suggest Google recognizes that upfront chip costs and switching costs are barriers to adoption, problems that financing might address.
For sales professionals focused on AI infrastructure, this market dynamic illustrates how supplier relationships and existing customer commitments can outweigh technical performance. Purchasing decisions in this space often hinge on ecosystem lock-in rather than product features alone, and the competitive dynamics of generative AI and LLM infrastructure remain shaped by vendor relationships established over years.