Google Cloud lands Lovable and Windsurf as AI coding customers, gaining ground on AWS and Azure

Google Cloud adds Lovable and Windsurf as customers, signaling momentum versus AWS and Azure. A $50B run rate, Gemini 2.5 Pro, credits, and GPUs aim to woo builders.

Published on: Sep 19, 2025

Google Cloud lands Lovable and Windsurf: here's what it means for builders

Google Cloud has signed fast-rising AI coding startups Lovable and Windsurf as customers, with Google now serving as their primary cloud provider. It's another signal that Google is gaining ground on AWS and Microsoft Azure, and that cloud is moving closer to the center of Google's strategy.

The business is still smaller than AWS and Azure and dwarfed by Google's ads unit, but momentum is clear. Google reported a $50B annual run rate for cloud, $43.2B in 2024 revenue (up from $33.1B in 2023), and says it has $58B in new revenue lined up over the next two years.

AI startups are a big driver. Google says it now works with 9 of the top 10 AI labs (including Safe Superintelligence and OpenAI) and supports 60% of the world's generative AI startups. Over the past year, Google saw a 20% increase in new AI startups choosing its cloud.

Why Lovable and Windsurf matter

Lovable and Windsurf don't yet spend like the largest labs or enterprises; the bet is on long-term growth. Both "vibe-coding" unicorns build on Gemini 2.5 Pro and run on Google Cloud infrastructure. Windsurf, now part of Cognition, also uses Gemini models in its integrations with Cognition's AI agent, Devin.

What Google is offering startups

  • $350,000 in credits via the Google for Startups Cloud Program.
  • Dedicated Nvidia GPU clusters for startups in the Y Combinator accelerator.
  • Direct access to Gemini models plus managed infrastructure to train, fine-tune, and serve AI workloads.
  • Community and visibility: Google hosted its first Google AI Builder's Forum, bringing together hundreds of founders and announcing 40+ new AI startups building on Google Cloud, including Lovable, Windsurf, Sequoia-backed Factory AI, and a16z-backed Krea AI.

Why this matters for your roadmap

GPU access, credits, and model availability influence platform choice as much as raw pricing. Early commitments can define your toolchain and cost structure for years.

  • Founders: Compare credit packages and GPU availability across providers. Validate queue times, regions, and support SLAs before you commit.
  • Engineering leaders: Model training and serving costs add up fast. Forecast spend (a back-of-envelope sketch follows this list), consider reserved capacity, and standardize on managed services where it reduces operational drag.
  • Developers: Trial Gemini 2.5 Pro for coding agents and IDE flows (a minimal API sketch also follows this list). Check SDKs, rate limits, and integrations with tools you already use (prompt stores, vector DBs, CI/CD).
  • Risk and compliance: Confirm data residency, encryption defaults, and audit trails. Keep a contingency plan for multi-cloud or workload portability.
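
To put the "forecast spend" advice into practice, here is a rough back-of-envelope sketch in Python. Every rate in it is a placeholder, not actual Google Cloud or Gemini pricing; swap in the quotes you get from your provider.

  # Back-of-envelope monthly spend forecast for an AI workload.
  # All prices below are placeholders, not real Google Cloud or Gemini rates;
  # replace them with the numbers from your own quotes.

  def monthly_inference_cost(requests_per_day: int,
                             avg_input_tokens: int,
                             avg_output_tokens: int,
                             usd_per_1m_input_tokens: float,
                             usd_per_1m_output_tokens: float) -> float:
      """Estimate monthly model-serving cost from traffic and token prices."""
      tokens_in = requests_per_day * avg_input_tokens * 30
      tokens_out = requests_per_day * avg_output_tokens * 30
      return (tokens_in / 1e6) * usd_per_1m_input_tokens \
           + (tokens_out / 1e6) * usd_per_1m_output_tokens

  def monthly_gpu_cost(gpus: int, usd_per_gpu_hour: float,
                       utilization: float = 1.0) -> float:
      """Estimate monthly cost of reserved GPU capacity for training or fine-tuning."""
      return gpus * usd_per_gpu_hour * 24 * 30 * utilization

  if __name__ == "__main__":
      serving = monthly_inference_cost(
          requests_per_day=50_000, avg_input_tokens=2_000, avg_output_tokens=800,
          usd_per_1m_input_tokens=1.25, usd_per_1m_output_tokens=10.0)  # placeholders
      training = monthly_gpu_cost(gpus=8, usd_per_gpu_hour=3.50)  # placeholder
      print(f"Estimated serving cost:  ${serving:,.0f}/month")
      print(f"Estimated training cost: ${training:,.0f}/month")

Even with placeholder rates, the exercise makes the trade-off visible: reserved GPU capacity is a fixed monthly line item, while serving cost scales with traffic and prompt size.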
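And for the developer bullet, a minimal sketch of calling Gemini 2.5 Pro from Python, assuming the google-genai SDK (pip install google-genai) and an API key in an environment variable; the exact model ID string and SDK surface may change, so verify against the current docs.

  # Minimal sketch: send a coding prompt to Gemini 2.5 Pro with the google-genai SDK.
  # Assumes an API key in the GOOGLE_API_KEY environment variable; the model ID
  # string is an assumption, so confirm it against the current model list.
  import os

  from google import genai

  client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])

  response = client.models.generate_content(
      model="gemini-2.5-pro",  # assumed ID for Gemini 2.5 Pro
      contents="Write a Python function that lists every CI job missing a timeout "
               "in a GitHub Actions workflow file.",
  )

  print(response.text)  # the model's reply (code plus explanation) as plain text

Rate limits, available regions, and SDK maturity are exactly the things the checklist above tells you to verify before committing.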

Market outlook

AI's compute appetite is lifting all major clouds. The global cloud market is expected to exceed $400B in 2025 and grow about 20% annually over the next five years, according to Synergy Research.

Watch for more startup customer announcements, bigger GPU allocations, and tighter model integrations. If Google keeps winning early-stage AI deals and backing them with capacity and credits, its cloud share should keep climbing.
