Autonomous AI networks accelerate materials discovery by sharing knowledge, not data

Autonomous AI labs that exchange learned trends rather than raw data found better materials faster in simulations. The NIMS-Tsukuba team reports a practical, privacy-friendly way to collaborate.

Published on: Jan 21, 2026

A team from Japan's National Institute for Materials Science (NIMS) and the University of Tsukuba reports a practical way for multiple autonomous AI systems to collaborate. Instead of trading raw datasets, the systems share distilled knowledge: trends learned during their own exploration. In simulations, this "autonomous AI network" improved optimization speed across systems exploring different material properties. The work was published in npj Computational Materials on December 9, 2025.

Why single systems stall

Most autonomous AI systems run in isolation and target different chemistries, making raw data hard to reuse across projects. Humans get around this by sharing insights, not spreadsheets. This study brings that same pattern to machines, turning siloed pipelines into a cooperative network.

How the autonomous AI network works

  • Each AI explores a distinct material space or objective (e.g., maximizing a specific physical property).
  • Systems exchange learned trends, compressed knowledge extracted from their data, rather than the data itself.
  • An algorithm incorporates external knowledge as a reference for decision-making (transfer learning) without overwriting local models.
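The "reference, not replacement" idea above can be sketched in a few lines. This is a minimal illustration of the general pattern, not the paper's algorithm: the function name, the blending rule, and the additive weighting are all assumptions made for the example. A local system keeps its own observations intact and only consults a peer's trend when ranking candidates, so the shared knowledge biases decisions without overwriting the local model.

```python
# Hypothetical sketch of "external knowledge as a reference" (illustrative
# names and blending rule; not the published method).

def rank_candidates(candidates, local_obs, external_trend, weight=0.5):
    """Rank candidates by local evidence plus a down-weighted peer trend.

    local_obs maps a candidate to its list of measured values; candidates
    with no local data fall back entirely on the shared trend.
    """
    scores = {}
    for x in candidates:
        seen = local_obs.get(x)
        if seen:
            local = sum(seen) / len(seen)          # local evidence dominates
            scores[x] = local + weight * external_trend(x)
        else:
            scores[x] = weight * external_trend(x)  # guided by shared trend only
    return sorted(candidates, key=lambda x: scores[x], reverse=True)

# Toy usage: a peer system reports "higher x tends to be better".
trend = lambda x: x / 10.0
obs = {3: [0.9]}                    # one strong local measurement at x = 3
order = rank_candidates(range(10), obs, trend)
print(order[:3])  # locally measured candidate first, then trend-favored ones
```

Note that the local observation at x = 3 still outranks the trend's favorite (x = 9): the peer's knowledge steers exploration where local data is absent but never overrides it.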

When three autonomous systems optimizing different properties spontaneously shared knowledge, all three reached strong candidates faster. In effect, the network raised exploration efficiency and improved sample use in simulation-based experiments.

What this means for research teams

  • Less duplication across projects working on related physics or chemistries.
  • Earlier traction in sparse-data regimes via guidance from neighboring systems.
  • Cross-lab collaboration without moving sensitive or incompatible raw datasets.
  • A pathway to scale as more self-driving labs and simulations come online.

Practical steps to try

  • Define a compact knowledge interface (e.g., embeddings, surrogate models, trend summaries) instead of sharing raw runs.
  • Weight external knowledge by domain distance and past benefit; down-weight it when it misleads.
  • Track the marginal value added by each peer system to decide which connections to strengthen.
  • Pilot with two pipelines optimizing different objectives, then expand the network incrementally.
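The second and third steps above, weighting peers by past benefit and tracking each connection's marginal value, can be combined in one small bookkeeping structure. This is an assumed scheme for illustration (the class, the multiplicative update, and the peer names are invented here, not taken from the paper): each peer gets a trust weight that grows when its hints help and shrinks when they mislead.

```python
# Illustrative trust-weighting sketch (assumed scheme, not from the paper):
# one weight per peer system, updated multiplicatively by observed benefit.

class PeerWeights:
    def __init__(self, peers, lr=0.3):
        self.w = {p: 1.0 for p in peers}  # start all peers at equal trust
        self.lr = lr                      # how fast trust reacts to outcomes

    def update(self, peer, benefit):
        """benefit > 0: peer's hint helped; benefit < 0: it misled us."""
        self.w[peer] = max(self.w[peer] * (1.0 + self.lr * benefit), 0.0)

    def normalized(self):
        """Relative trust per peer; useful for deciding which links to keep."""
        total = sum(self.w.values()) or 1.0
        return {p: v / total for p, v in self.w.items()}

# Toy usage with two hypothetical peer pipelines.
pw = PeerWeights(["bandgap_lab", "conductivity_lab"])
pw.update("bandgap_lab", +1.0)       # its trend pointed at a better sample
pw.update("conductivity_lab", -0.5)  # its trend pointed the wrong way
print(pw.normalized())
```

The normalized weights give a simple readout of which connections are earning their keep; peers whose weight decays toward zero are candidates for pruning as the network grows.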

Team, funding, and figure credit

This project was conducted by NIMS and the University of Tsukuba, led by Yuma Iwasaki (NIMS) and Yasuhiko Igarashi (University of Tsukuba), as part of JST CREST "Scientists augmentation and materials discovery by hierarchical autonomous materials search" (JPMJCR21O1).

Conceptual illustration shows researcher-to-researcher and AI-to-AI knowledge sharing; credit: Yuma Iwasaki (NIMS); Yasuhiko Igarashi (University of Tsukuba).


