NVDA dips as Google-Meta talks hint at direct TPU sales and rising in-house silicon
Nvidia fell 2.6% after a report said Google is in active talks to supply Meta with its own AI chips, TPUs, starting in 2027. That would be a shift from Google's current model of renting TPU capacity via Google Cloud toward placing TPUs directly in customer data centers.
The report also said Google is pitching TPU sales to other cloud clients, with an internal view that it could pull in revenue equal to roughly 10% of Nvidia's annual sales. AMD slipped more than 4% on the same headlines.
Why this matters for your models
- Hyperscalers are moving from buyer to competitor. Google, Amazon, and Microsoft now field credible in-house AI silicon that can pressure Nvidia's pricing power, mix, and long-term unit demand.
- Capex allocation could shift. If TPUs are sold into third-party racks, AI capex may tilt toward custom ASICs, whether built in-house or bought from a hyperscaler, and away from general-purpose GPUs, changing the split of spend across compute, networking, and memory.
- CUDA moat vs. buyer leverage. Nvidia's software stack still commands a premium, but volume buyers with custom models can trade peak flexibility for cost and throughput. Expect tougher procurement cycles and more bespoke deals.
- Second-order effects: watch networking and memory. Even if GPU share normalizes, demand for high-speed Ethernet/InfiniBand and HBM stays tight as long as overall AI training capacity keeps expanding.
The setup behind the headline
Amazon has already scaled its own AI chips and reportedly rented about half a million units to Anthropic, signaling real customer willingness to adopt non-Nvidia silicon. Google announced a sizable partnership with Anthropic, and OpenAI tested Google's chips this summer.
Speculation around Google's chip ambitions has been building. One sell-side note in September flagged strong interest from top AI labs in buying TPUs outright and floated a sum-of-the-parts value that put Google's TPU operations and DeepMind near the trillion-dollar mark.
What Nvidia is saying
Nvidia congratulated Google on its progress and emphasized that it continues to supply the company with chips, while asserting it remains "a generation ahead" of peers. Management also pushed back on "circular AI deal" criticism, arguing that its strategic investments are disclosed, portfolio companies are growing revenue, and demand for AI applications is genuine.
In a memo to analysts, Nvidia rejected comparisons to past accounting blowups, stating its business is economically sound and reporting is transparent, explicitly distancing itself from Enron, WorldCom, and Lucent.
Key catalysts to watch
- Formal confirmation and sizing of any Google-Meta TPU agreement, including delivery schedule and support commitments.
- TPU roadmap updates versus Nvidia's next-gen platforms, and any third-party benchmarks that quantify performance-per-dollar and total cost of ownership (a rough framing sketch follows this list).
- Hyperscaler disclosures on AI capex mix (internal silicon vs. merchant GPUs), plus any commentary on networking and HBM supply.
- Evidence of broader enterprise adoption of TPUs beyond AI labs: reference customers, software toolchains, and ecosystem depth.
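One way to make the benchmark point concrete is a back-of-the-envelope performance-per-dollar and TCO comparison. The sketch below is illustrative only: the Accelerator class, the unit prices, power draws, throughput figures, and the overhead multiplier are assumptions introduced here, not published benchmarks or vendor pricing.

```python
# Hypothetical performance-per-dollar / TCO comparison sketch.
# All figures below are illustrative placeholders, not vendor data.

from dataclasses import dataclass

@dataclass
class Accelerator:
    name: str
    unit_price: float      # purchase price per chip, USD (assumed)
    power_kw: float        # average draw per chip, kW (assumed)
    tokens_per_sec: float  # sustained training throughput, tokens/s (assumed)

def tco_per_token(acc: Accelerator, years: float = 4.0,
                  power_cost_kwh: float = 0.08, overhead: float = 1.3) -> float:
    """Rough cost per token over a depreciation window.

    The overhead multiplier folds in cooling, networking, and facility costs.
    """
    hours = years * 365 * 24
    energy_cost = acc.power_kw * hours * power_cost_kwh   # kW * h * $/kWh
    total_cost = (acc.unit_price + energy_cost) * overhead
    total_tokens = acc.tokens_per_sec * hours * 3600       # tokens/s * seconds
    return total_cost / total_tokens

# Placeholder inputs -- swap in real benchmark and pricing data when available.
gpu = Accelerator("merchant_gpu", unit_price=30_000, power_kw=1.0, tokens_per_sec=50_000)
tpu = Accelerator("custom_asic", unit_price=18_000, power_kw=0.7, tokens_per_sec=35_000)

for acc in (gpu, tpu):
    print(f"{acc.name}: ~${tco_per_token(acc):.2e} per token")
```

The useful output is not the absolute cost per token but the ratio between the two rows once real benchmark and pricing inputs are plugged in.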
Portfolio angles to consider (not investment advice)
- Scenario-test 2027-2029: introduce price/mix pressure on merchant GPUs, slower unit growth, and higher opex for software differentiation; compare against upside from inference and edge deployments (see the sketch after this list).
- Track suppliers levered to AI infra breadth, not just Nvidia units: high-speed networking, optical interconnect, HBM, and advanced packaging capacity.
- Watch foundry capacity signals for advanced nodes affecting both GPUs and TPUs; tightness can stabilize pricing even in a more competitive setup.
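As a rough illustration of the scenario test above, the snippet below compounds assumed unit-growth and ASP (price/mix) changes over a three-year horizon. The base revenue index, scenario names, and the growth and pricing inputs are placeholders chosen for demonstration, not forecasts.

```python
# Illustrative 2027-2029 scenario sketch for a merchant-GPU revenue line.
# Every input below is an assumption for demonstration, not an estimate.

def project_revenue(base_revenue: float, unit_growth: float,
                    asp_change: float, years: int = 3) -> list[float]:
    """Compound annual unit growth and ASP (price/mix) change over a horizon."""
    out = []
    revenue = base_revenue
    for _ in range(years):
        revenue *= (1 + unit_growth) * (1 + asp_change)
        out.append(revenue)
    return out

BASE = 100.0  # index starting data-center revenue to 100 for comparability

scenarios = {
    "status_quo":         {"unit_growth": 0.25, "asp_change": 0.00},
    "tpu_share_shift":    {"unit_growth": 0.15, "asp_change": -0.05},
    "aggressive_pricing": {"unit_growth": 0.10, "asp_change": -0.10},
}

for name, params in scenarios.items():
    path = project_revenue(BASE, **params)
    print(name, [round(x, 1) for x in path])
```

Replacing the indexed base with consensus revenue estimates turns the same loop into a quick sensitivity table for price/mix and unit-growth assumptions.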
Where this could go next
If Google moves from renting to selling TPUs at scale, Nvidia's immediate demand may hold through current backlogs, but medium-term ASPs and share could face pressure. The biggest swing factor is software: how fast customers can port workloads without sacrificing time-to-train or time-to-market.
For now, Nvidia remains the default for many AI builders. But the buyer base is getting smarter and more diversified, which means procurement will be more contested and more price-sensitive over time.
Relevant resources: learn more about TPUs on Google's official Google Cloud TPU page.
If you build or analyze AI initiatives inside finance teams, this curated list can help you benchmark practical tools: AI tools for finance.