UT Austin Leads in Digital Twins: Physics-smart AI, Gordon Bell-winning tsunami forecasts, and Horizon-scale computing

UT Austin is advancing digital twins that fuse physics and AI for fast, trustworthy predictions in energy, health, and hazards. Its sub-second tsunami forecasting work won the 2025 ACM Gordon Bell Prize.

Published on: Feb 28, 2026

Pioneering AI for Science: Why UT Is a Digital Twin Powerhouse

Research | Feb 27, 2026

After a decade of advances in AI, mathematics, and supercomputing, The University of Texas at Austin is setting the pace in digital twins: physics-informed, AI-driven models that deliver fast, high-fidelity predictions for energy, health care, national security, and natural hazard mitigation.

What makes this different from conventional simulation is speed and trust. UT teams are building twins that learn from live data, obey the laws of physics, quantify uncertainty, and answer real "what if" questions in time to act.

From research to real-time: tsunami forecasts in a fraction of a second

Working on the high-risk Cascadia Subduction Zone, UT-led researchers built a digital twin that issues high-accuracy tsunami forecasts in under a second, roughly a ten-billion-fold speedup over prior methods. The breakthrough earned the 2025 ACM Gordon Bell Prize.

The team fused seafloor pressure data with physics-based wave models and executed novel algorithms across elite supercomputers, including El Capitan, Perlmutter, and TACC's Frontera. Results compress what once demanded decades of compute into moments that can save lives along a coastline with significant seismic risk. For background on the fault system, see the USGS overview of the Cascadia Subduction Zone.

What a digital twin must do to be useful

A digital twin is a living model of a physical system, synchronized by real sensor data. Unlike a one-off simulation, it updates continuously, predicts forward, and informs decisions under uncertainty.

  • Embed governing physics (heat, fluids, structures) to answer "what if" with confidence.
  • Assimilate data in real time to stay calibrated and detect regime shifts early.
  • Quantify uncertainty to support high-consequence decisions.
  • Use reduced-order surrogates that are fast but grounded in first principles.
  • Run at HPC scale when fidelity, ensembles, or rapid response are required.
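The first three requirements above can be sketched in a few lines. The following is a minimal, illustrative loop (not any UT system's code), using Newton's law of cooling as a stand-in for the governing physics and a scalar Kalman filter as the assimilation step; all parameter values are assumptions for the demo.

```python
import math

# Minimal digital-twin loop (illustrative sketch):
# physics model = Newton's law of cooling, dT/dt = -k (T - T_env)
# assimilation  = scalar Kalman filter on noisy temperature readings
K_COOL, T_ENV, DT = 0.1, 20.0, 1.0     # assumed physical parameters
Q, R = 0.05, 1.0                       # process / sensor noise variances

def physics_step(T):
    """One forward step of the governing physics (exact exponential decay)."""
    return T_ENV + (T - T_ENV) * math.exp(-K_COOL * DT)

def assimilate(T_est, P, measurement):
    """Predict with physics, then correct with a sensor reading (Kalman update)."""
    T_pred = physics_step(T_est)
    P_pred = P + Q                      # uncertainty grows during prediction
    gain = P_pred / (P_pred + R)        # trust data more when model is uncertain
    T_new = T_pred + gain * (measurement - T_pred)
    P_new = (1.0 - gain) * P_pred       # uncertainty shrinks after assimilation
    return T_new, P_new

# The twin stays synchronized with a stream of (noisy) sensor readings
T_est, P = 90.0, 4.0
for z in [88.1, 85.9, 84.4, 82.7]:     # hypothetical sensor stream
    T_est, P = assimilate(T_est, P, z)
print(f"state estimate {T_est:.1f} degC, variance {P:.2f}")
```

The key property is that the state estimate and its variance update together, so downstream "what if" queries inherit quantified uncertainty rather than a bare point prediction.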

The Oden advantage: foundations first

Under Karen Willcox, the Oden Institute is building the mathematical backbone of predictive digital twins: scientific machine learning plus reduced-order modeling, with rigorous uncertainty quantification for real-time use.
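One way to see how "scientific machine learning plus reduced-order modeling" differs from black-box fitting is operator inference, a technique associated with Willcox's group: assume a physics-motivated form for the reduced dynamics and learn only its operators from snapshot data. The scalar sketch below is a hedged illustration with made-up numbers, not the institute's actual tooling.

```python
import math

# Operator inference in its simplest form (illustrative sketch):
# assume the reduced dynamics are linear, dx/dt = a*x, and learn the
# operator a from simulation snapshots by least squares.
A_TRUE, DT = -0.5, 0.01                                # assumed "truth" for the demo
xs = [math.exp(A_TRUE * DT * k) for k in range(200)]   # snapshot data

# Finite-difference estimates of dx/dt at each snapshot
xdots = [(xs[k + 1] - xs[k]) / DT for k in range(len(xs) - 1)]

# Least-squares fit of a in xdot = a*x (closed form for the scalar case)
a_hat = sum(xd * x for xd, x in zip(xdots, xs)) / sum(x * x for x in xs[:-1])

# The learned operator yields a physics-structured surrogate we can step cheaply
x_rom = [1.0]
for _ in range(199):
    x_rom.append(x_rom[-1] * (1.0 + a_hat * DT))       # forward Euler

print(f"learned operator a ~= {a_hat:.4f} (truth {A_TRUE})")
```

Because the surrogate keeps the governing-equation structure, it can be interrogated and error-bounded in ways a generic neural fit cannot.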

Oden leads the Department of Energy MMICC on Multifaceted Mathematics for Predictive Digital Twins (M2dt), directed by Omar Ghattas, with collaborators across Sandia, Brookhaven, Argonne, and MIT. The center develops new mathematical and statistical frameworks, ML methods, and algorithms that fuse physics and data for complex energy systems.

A Department of Defense MURI at Oden targets scalable uncertainty quantification for aerospace and defense twins. Researchers are also partnering with the Texas Institute for Electronics to prototype a digital twin of semiconductor manufacturing. "Combined with the advanced packaging capabilities TIE is building, this work can drive faster co-optimization and help bring next-generation compute platforms from concept to reality more quickly," said Mark Papermaster, CTO and executive vice president of AMD.

Field-ready twins across domains

The Oden Institute's scientific ML engines don't just fit data; they respect physics. Its annual workshop convenes leading mathematicians and engineers to push AI for Science forward.

In health care, the Center for Computational Oncology, led by Thomas Yankeelov, develops biophysical tumor models to optimize and personalize treatments. For coastal risk, a team led by Clint Dawson builds hurricane storm-surge twins to guide evacuations and resource staging.

TACC: the computational engine

UT's Texas Advanced Computing Center provides the hardware to run complex twins at scale: Frontera, Vista, and Stampede have already supported major wins. The next wave is near.

TACC will host the NSF Leadership-Class Computing Facility featuring Horizon, about 10x more powerful for scientific simulations and roughly 100x for AI versus Frontera, with 4,000 NVIDIA Blackwell GPUs and 1 million CPU cores. Coming online in spring 2026, Horizon will enable richer physics, tighter uncertainty bounds, and smarter decisions for larger, more intricate systems.

How the tsunami twin works: fast, accurate, actionable

Researchers discretized the ocean with a high-resolution mesh to track energy transfer from fault rupture to coastal inundation. Seafloor pressure sensors feed the model, which runs physics-consistent predictions in a fraction of a second.

"AI for Science differs from commercial AI because it does more than just find patterns; it reflects the laws of nature," said Ghattas. "By learning from data through the lens of physics models, we can exploit the structure of wave propagation models to overcome the sparsity of data while still issuing accurate forecasts with rigorously quantified uncertainties."
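The idea Ghattas describes, exploiting model structure to overcome sparse data, can be sketched with a toy linear-Gaussian inverse problem. In the sketch below (synthetic numbers throughout, not the team's actual formulation), a linear wave model maps an unknown source amplitude m to readings at a few seafloor pressure sensors; because the map is linear and the noise Gaussian, the posterior over m, and hence the coastal forecast with its uncertainty, is available in closed form, which is what makes real-time operation plausible.

```python
# Bayesian inversion sketch: d = G*m + noise, Gaussian prior on m.
G = [0.8, 0.5, 0.3]            # assumed sensor sensitivities to the source
d = [1.7, 1.0, 0.65]           # noisy pressure readings (synthetic)
sigma2, prior_var = 0.01, 4.0  # sensor noise and prior variance on m

# Closed-form Gaussian posterior for m (scalar case)
precision = 1.0 / prior_var + sum(g * g for g in G) / sigma2
post_var = 1.0 / precision
post_mean = post_var * sum(g * di for g, di in zip(G, d)) / sigma2

# The physics model then propagates the inferred source to a coastal
# amplitude forecast, carrying uncertainty through the linear map
# (h is an assumed propagation gain).
h = 2.5
forecast = h * post_mean
forecast_std = h * post_var ** 0.5
print(f"coastal amplitude {forecast:.2f} +/- {2 * forecast_std:.2f} m (2-sigma)")
```

Only three sensors constrain the source here, yet the forecast comes with a defensible error bar because the wave physics, not the data volume, supplies most of the structure.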

Nuclear twins: accelerating safe innovation

With $18 million in state support, mechanical engineering associate professors Kevin Clarno and Derek Haas are using operational reactor data to predict conditions, validate safety, and speed licensing for advanced nuclear technologies.

The team streams high-frequency telemetry from UT's 1-MW research reactor to TACC systems, reconstructs past behavior, and forecasts what happens next. Real-time models run alongside operations, using new instrumentation to guide experiments and improve daily performance.
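The "real-time models run alongside operations" pattern boils down to comparing streaming telemetry against the physics model's prediction and flagging residuals that exceed expected sensor noise. A minimal sketch, with assumed numbers and no relation to the actual UT reactor software:

```python
# Residual monitoring sketch: flag telemetry that drifts from the model.
EXPECTED_MW, NOISE_STD = 1.0, 0.01        # 1-MW setpoint model, assumed noise

def monitor(telemetry, threshold=3.0):
    """Yield (reading, flagged) pairs; flag residuals beyond threshold*sigma."""
    for power in telemetry:
        residual = power - EXPECTED_MW
        yield power, abs(residual) > threshold * NOISE_STD

stream = [1.004, 0.998, 1.010, 1.060, 1.070]   # synthetic telemetry (MW)
flags = [flag for _, flag in monitor(stream)]
print(flags)
```

In a production twin the "expected" value would itself come from a continuously assimilated model, so the monitor detects genuine regime shifts rather than ordinary transients.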

Progress comes from UT's interdisciplinary culture-nuclear engineers, operators, instrumentation experts, data scientists, and HPC specialists working in a feedback loop. Students are embedded across the full pipeline, from sensors to models to decisions.

Practical takeaways for research teams

  • Start with physics. Let governing equations shape architectures, features, and constraints.
  • Plan data assimilation and uncertainty quantification at the outset; don't bolt them on later.
  • Co-design HPC and models. Validate reduced-order surrogates against full-fidelity runs.
  • Instrument aggressively. High-frequency, high-quality data is the lifeblood of a live twin.
  • Engineer for sparse, noisy, and failed sensors, especially in hazard and defense settings.
  • Integrate operators and decision-makers early so outputs match real decisions and timelines.

For broader context on methods, tools, and workflows across labs and HPC environments, see AI for Science & Research.

Looking ahead

From sub-second tsunami warnings to reactor twins that speed nuclear innovation to semiconductor process twins for national defense, UT is pushing digital twins into high-impact territory.

With Horizon coming online in 2026, continued investments in math and algorithms through Oden, cross-campus collaboration, and deep partnerships with national labs and industry, UT is positioned to expand digital twin applications across defense, energy, health care, and natural hazards-advancing how teams anticipate, respond to, and overcome critical challenges.
