AI in 2026: A Tale of Two AIs
2026 will run on a split screen. Data centers and AGI timelines hit delays. End-user adoption keeps marching ahead.
Hype cools. Results keep showing up. The story is simple: infrastructure stalls while software and usage compound.
Where things stand going into 2026
- Big Tech's appetite for AI CapEx is stronger than ever.
- Google and Meta are all-in on AI. Microsoft and Amazon pulled back slightly in 2025 but are still pressing forward.
- Supply chain players are uneasy. They worry that end demand, the customer's customer, won't justify the buildout.
- Revenue from AI is still small relative to investment: tens of billions in revenue today versus trillions in projected CapEx over the next five years.
- Two clear winners today: coding assistants and ChatGPT, each on track for double-digit billions in revenue.
- Nearly a dozen startups are closing in on $100M+ ARR across varied use cases.
- Enterprises are struggling with in-house rollouts, leading to fatigue and disappointment.
Tale 1: The Year of Delays
Expect demand from hyperscalers to run into a supply chain that can't ramp fast enough. TSMC and ASML hold key positions and won't be rushed. Capacity is scarce, slots are spoken for, and lead times stretch. For background on lithography constraints, see ASML.
Industrial bottlenecks will bite as projects hit later stages: generators, cooling, switchgear, transformers, grid interconnects, and, critically, skilled labor. Any slip in those inputs pushes timelines out. If hyperscalers start warehousing chips instead of racking them, that's your signal delays are here.
Most AI data centers take about two years to build. 2024 was announcements, 2025 put money in the ground, and 2026 decides: either new capacity lands and compute gets cheaper, or projects miss deadlines and costs stay sticky.
There's another delay: AGI. The "AGI by 2027" chant has faded. Recent long-form interviews moved the window to the 2030s, at the earliest. A good jumping-off point: Dwarkesh Patel's podcast. The risk: some of today's CapEx could age poorly if the path shifts.
Tale 2: The Relentless Drive to Adoption
Adoption isn't slowing. If anything, the best startups are scaling faster. The $0 to $100M club is real, and a $0 to $1B club will show up in 2026.
Founders are building with high efficiency (often >$1M in revenue per employee), which signals real pull from customers. Teams are using AI agents inside the company for legal, recruiting, and sales, compounding learning and output.
AI app companies ride a favorable compute cost curve as capacity arrives through 2030. Meanwhile, enterprise DIY fatigue gives specialized startups more room to win.
What this means for IT, Dev, and Research leaders
- Capacity planning: Assume delays. Diversify regions and providers. Negotiate flexible terms, failover options, and clear SLAs for GPU/TPU availability.
- Model strategy: Use smaller, fine-tuned models for most tasks; reserve large models for high-impact cases. Add caching, batch inference, and dynamic routing to cut cost per request (a minimal routing sketch follows this list).
- Build vs buy: If internal pilots stall, adopt proven AI apps (coding assistants, chat workflows) now. Track payback periods and unit economics per use case.
- Data readiness: Clean, label, and segment data for RAG and fine-tuning. Put evaluation harnesses in place. Guard rails for PII, source control, and audit logs are non-negotiable.
- Org design: Stand up "AI ops" across legal, recruiting, support, and sales. Measure time saved, error rates, and revenue per employee.
- Procurement due diligence: Ask vendors about construction status, interconnects, and contingency sites. Press for transparency on lead times for generators and cooling.
- Metrics that matter: time-to-first-value, % of tasks augmented, cost per 1k tokens, acceptance rate of AI suggestions, model-driven revenue, incident rate (a simple tracking sketch also follows below).
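To make the routing and caching point concrete, here is a minimal sketch in Python. It assumes a generic `call_model(model, prompt)` client; the model names, prices, and routing rule are placeholders, not any vendor's real API or pricing. In practice the routing rule would be a classifier, confidence score, or task allowlist, and the cache would live in a shared store such as Redis.

```python
import hashlib

# Placeholder per-1k-token prices (USD) -- illustrative only, not real vendor pricing.
PRICE_PER_1K = {"small-model": 0.0002, "large-model": 0.01}

_cache: dict[str, str] = {}   # exact-match response cache keyed by prompt hash
_spend = {"usd": 0.0}         # running cost estimate, feeds the cost-per-request metric


def call_model(model: str, prompt: str) -> str:
    """Stand-in for a real inference client (API call or self-hosted model)."""
    return f"[{model}] answer to: {prompt[:40]}"


def is_high_impact(prompt: str) -> bool:
    """Toy routing rule. In practice: a classifier, confidence score, or task allowlist."""
    return len(prompt) > 500 or prompt.lower().startswith("critical:")


def route(prompt: str) -> str:
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key in _cache:                       # cache hit: no inference cost at all
        return _cache[key]
    model = "large-model" if is_high_impact(prompt) else "small-model"
    answer = call_model(model, prompt)
    approx_tokens = len((prompt + answer).split())   # crude token estimate for the sketch
    _spend["usd"] += PRICE_PER_1K[model] * approx_tokens / 1000
    _cache[key] = answer
    return answer


if __name__ == "__main__":
    print(route("Summarize this ticket for the support queue."))
    print(route("critical: draft the customer-facing incident report."))
    print(f"estimated spend so far: ${_spend['usd']:.6f}")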
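And a small sketch of how two of the metrics above (cost per 1k tokens, acceptance rate) could be tracked. The `UsageRecord` fields and the sample numbers are illustrative assumptions, not real usage data.

```python
from dataclasses import dataclass


@dataclass
class UsageRecord:
    tokens_in: int
    tokens_out: int
    cost_usd: float
    suggestion_accepted: bool


def cost_per_1k_tokens(records: list[UsageRecord]) -> float:
    total_tokens = sum(r.tokens_in + r.tokens_out for r in records)
    total_cost = sum(r.cost_usd for r in records)
    return 1000 * total_cost / total_tokens if total_tokens else 0.0


def acceptance_rate(records: list[UsageRecord]) -> float:
    return sum(r.suggestion_accepted for r in records) / len(records) if records else 0.0


if __name__ == "__main__":
    log = [
        UsageRecord(1200, 300, 0.004, True),
        UsageRecord(800, 150, 0.002, False),
    ]
    print(f"cost per 1k tokens: ${cost_per_1k_tokens(log):.4f}")
    print(f"acceptance rate: {acceptance_rate(log):.0%}")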
Signals to watch in 2026
- Public reports of data center delays; energy interconnect queues lengthening.
- Rising chip inventories at hyperscalers (a hint that facilities aren't ready).
- CapEx guidance from Big Tech: acceleration vs pause, especially on AI-specific lines.
- Compute prices: do they fall in H2 or stall? That will tell you who actually came online.
- Enterprise sentiment: DIY fatigue driving more third-party adoption.
If AGI slips to the 2030s
Don't bet the company on a near-term step-change. Favor modular designs, portability across providers, and contracts you can unwind. Optimize for steady compounding of AI-assisted workflows rather than a single breakthrough.
A practical 90-day plan
- Days 0-30: Audit workloads. Pick 3-5 use cases with clear ROI. Define metrics and baselines. Lock in flexible compute options.
- Days 31-60: Pilot coding assistants across one team. Ship two chat workflows (support and internal knowledge). Add caching, guard rails, and evals (see the eval sketch after this plan).
- Days 61-90: Expand to 25-50% team coverage. Negotiate provider terms based on real usage. Publish internal best practices and training.
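For the guard rails and evals step, here is a minimal evaluation-harness sketch, again with hypothetical names: `generate` stands in for whatever assistant or workflow is under test, the test cases are made up, and the PII patterns are deliberately crude placeholders for a real redaction or DLP service.

```python
import re

# Toy PII guardrail: flag outputs containing obvious email addresses or SSN-shaped numbers.
# Real deployments would use a dedicated PII/redaction service; these patterns are illustrative.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),   # email address
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),     # SSN-shaped number
]


def violates_guardrail(text: str) -> bool:
    return any(p.search(text) for p in PII_PATTERNS)


def run_evals(generate, cases: list[dict]) -> dict:
    """Score a generate(prompt) -> str function against expected keywords and guardrails."""
    passed = 0
    for case in cases:
        output = generate(case["prompt"])
        ok = case["expect"].lower() in output.lower() and not violates_guardrail(output)
        passed += ok
    return {"total": len(cases), "passed": passed, "pass_rate": passed / len(cases)}


if __name__ == "__main__":
    # Stand-in generator; swap in the real assistant or workflow under test.
    def fake_generate(prompt: str) -> str:
        return "Reset the password from the account settings page."

    cases = [
        {"prompt": "How do I reset my password?", "expect": "account settings"},
        {"prompt": "Summarize ticket #123", "expect": "refund"},
    ]
    print(run_evals(fake_generate, cases))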
Here's the bottom line: 2026 slows the build, not the usage. Plan for delays in concrete and steel. Sprint on software and workflow change. That's where the gains will come from.