Mark Zuckerberg: At CZI's Biohub, researchers want GPUs, not more space or staff
Mark Zuckerberg and Priscilla Chan are steering the Chan Zuckerberg Initiative (CZI) toward a simple priority for science: compute. Researchers at CZI's Biohub told them they'd rather get GPUs than more lab space or headcount.
On the a16z Podcast, Chan put it plainly: "We are not expanding a lot of square footage, per se, but we're expanding our compute." Zuckerberg added: "The researchers, they don't want employees working for them, they don't want space, they just want GPUs." Chan described compute as the "new lab space," adding, "It's much more expensive than wet lab space."
Why GPUs are the new lab space
Biology is now inseparable from large-scale modeling. Protein design, generative models for sequences, multimodal assays: none of this runs well without serious acceleration. For many teams, the bottleneck isn't benches or desks; it's queue times and throughput on shared clusters.
- Model training and inference are becoming core methods in bioresearch, alongside wet lab workflows.
- High-throughput experiments generate data that demands tight integration with scalable compute.
- Teams trade square footage for cluster access because faster cycles beat bigger floor plans.
CZI's buildout: fewer walls, more accelerators
CZI is investing in Biohub's compute instead of expanding physical footprint. According to Chan, they operate a cluster with about 1,000 GPUs today and plan to reach "the 10,000 range" by 2028. That's a clear signal: the core lab asset is shifting from pipettes to processors.
On hiring, Zuckerberg said CZI will pursue both a "network model" built out through additional sites and a central AI team. Biohub also announced a partnership with EvolutionaryScale to apply AI to human disease research.
What this means for research leaders
- Budget shifts: prioritize accelerators, interconnects, and storage over more real estate. Plan for multi-year upgrades.
- Shared infrastructure: create institute-level queues and fair-scheduling policies; reduce idle time with mixed workloads (training + inference). A minimal fair-share sketch follows after this list.
- Cloud vs on-prem: balance burst capacity against data egress, residency, and cost predictability; the break-even sketch below shows the core arithmetic.
- Data plumbing: invest in clean, well-annotated datasets and fast I/O; bottlenecks here erase GPU gains.
- Model choices: prefer efficient architectures and distillation for routine inference; reserve big runs for pivotal questions (see the distillation sketch below).
- Energy and cooling: treat rack density and power draw as first-class design constraints alongside budget.
- Grants and partnerships: write proposals that fund compute directly; pool resources across departments to hit useful scale.
- Talent strategy: offer cluster access, high-quality datasets, and technical mentorship; this can beat salary alone.
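The fair-scheduling idea reduces to a small calculation: a group's priority falls as its recent usage exceeds its entitled share. The sketch below uses the classic 2^(-usage/share) fair-share factor in plain Python; the group names and numbers are hypothetical, and production schedulers (e.g., Slurm's multifactor plugin) implement far more elaborate versions.

```python
from dataclasses import dataclass

@dataclass
class Group:
    name: str
    share: float              # fraction of the cluster this group is entitled to
    recent_gpu_hours: float   # decayed usage over the scheduling window

def fairshare_priority(group: Group, total_recent_gpu_hours: float) -> float:
    """Higher when a group has used less than its entitled share recently."""
    if total_recent_gpu_hours == 0:
        return 1.0
    usage_fraction = group.recent_gpu_hours / total_recent_gpu_hours
    # 2^-(usage/share): 1.0 when idle, 0.5 when usage equals share, toward 0 when over-using
    return 2 ** (-usage_fraction / group.share)

# Hypothetical groups and usage figures, purely for illustration
groups = [
    Group("protein-design", share=0.5, recent_gpu_hours=900.0),
    Group("imaging",        share=0.3, recent_gpu_hours=100.0),
    Group("single-cell",    share=0.2, recent_gpu_hours=0.0),
]
total = sum(g.recent_gpu_hours for g in groups)
for g in sorted(groups, key=lambda g: fairshare_priority(g, total), reverse=True):
    print(f"{g.name:15s} priority={fairshare_priority(g, total):.2f}")
```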
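For the cloud-versus-on-prem question, the core arithmetic is a break-even utilization: how busy must an owned accelerator be before buying beats renting? Every price, lifetime, and overhead figure below is an illustrative placeholder, not a quoted rate; substitute real numbers from vendor quotes and facility costs.

```python
# Break-even utilization for owning vs. renting a GPU.
# All numbers are illustrative placeholders, not quoted prices.

CLOUD_PRICE_PER_GPU_HOUR = 2.50   # $/GPU-hour on demand (assumed)
GPU_CAPEX = 30_000.0              # purchase + host share per GPU (assumed)
LIFETIME_YEARS = 4                # amortization period (assumed)
FACILITY_PER_GPU_HOUR = 0.35      # power, cooling, space (assumed)
OPS_PER_GPU_HOUR = 0.15           # admin/staff share (assumed)

hours = LIFETIME_YEARS * 24 * 365
owned_per_hour = GPU_CAPEX / hours + FACILITY_PER_GPU_HOUR + OPS_PER_GPU_HOUR

# Owning pays owned_per_hour around the clock; cloud only charges for busy hours.
# Break-even is the utilization at which cloud spend equals the cost of owning.
breakeven_utilization = owned_per_hour / CLOUD_PRICE_PER_GPU_HOUR

print(f"Owning costs ${owned_per_hour:.2f}/GPU-hour, busy or idle")
print(f"Cloud wins below {breakeven_utilization:.0%} sustained utilization")
```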
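On distillation, the standard recipe trains a small "student" model against the softened output distribution of a larger "teacher" so routine inference runs cheaply. A minimal PyTorch sketch of the generic technique follows; the tensor shapes, temperature, and blend weight are arbitrary, and this is not a CZI-specific pipeline.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend soft-target KL loss (teacher) with the usual hard-label loss."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL between softened distributions; T^2 rescales gradients (Hinton et al., 2015)
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage: random tensors stand in for real sequence/structure model outputs
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
print(f"distillation loss: {loss.item():.3f}")
```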
Where CZI is pointing next
CZI launched in 2015 with a focus on education, public policy, and curing disease. Over the past few years it has shifted to a science-centered mission, and Zuckerberg says they're going "all in on AI-driven biology." The organization has also listed several AI-focused roles, including senior leadership positions in AI infrastructure and engineering.
If you're upskilling for AI-heavy biology
For researchers mapping out their own compute and methods stack, curated learning resources can shorten the ramp.