Ai2 brings $152M computing cluster online for open scientific AI models
The Allen Institute for AI has activated a new computing system funded by Nvidia and the National Science Foundation, marking the first major milestone in a $152 million effort to build open AI models for scientific research.
Ai2, the Seattle-based institute, received the funding last August through the White House AI Action Plan. The project, called Open Multimodal AI Infrastructure for Science, targets fields including materials science, biology, and energy.
The computing cluster sits outside Austin and runs on Nvidia's Blackwell Ultra chips, managed by Cirrascale Cloud Services. Noah Smith, Ai2's senior research director and principal investigator on the project, called it a "critical step" in keeping advanced AI development accessible to the broader research community.
What sets Ai2 apart
Ai2 releases the full code, data, and training methods behind its models, a practice most large-scale AI projects don't follow. This approach lets other researchers reproduce and build on the work.
The institute has already used the new infrastructure to upgrade its Molmo and OLMo model families. Recent releases include a multimodal model capable of video understanding and a more efficient language model architecture.
Current research priorities
Ai2 is now focused on building unified models that handle multiple types of data, developing AI agents, and working directly with scientific communities to ensure models solve real-world research problems.
The announcement comes as Ai2 rebuilds after losing its CEO and several top researchers to Microsoft in March. Interim CEO Peter Clark said this week that the institute remains committed to open models, longer-term research, and applied AI in scientific discovery and environmental science.