Japan Bets $6.3B on Homegrown AI as SoftBank Leads Drive to Rival US and China

Japan will pour ¥1T ($6.34B) over five years into a public-private push to build its largest AI model and the compute behind it. SoftBank is in, with work kicking off FY2026.

Published on: Dec 23, 2025

Japan commits ¥1 trillion ($6.34B) to build a domestic AI foundation model

Japan is putting serious money behind homegrown AI. The government plans to fund a new public-private company with around ¥1 trillion over five years to build the country's largest foundation model and the infrastructure to support it.

About 10 firms will found the company, including SoftBank Group. Roughly 100 engineers from SoftBank and Preferred Networks are expected to staff the effort in its early stages. The support program is slated to start in fiscal year 2026, beginning next April.

Why Japan is doing this

  • Compete with the U.S. and China on core AI capability tied to national security.
  • Secure compute: government backing may help with semiconductor procurement during global shortages.
  • Build domestic capacity: fund data centers, reduce the digital divide, and keep critical AI talent and IP onshore.
  • Push "physical AI": tighter integration of models with robotics, an area where Japan wants a stronger position.

What this means for IT and development teams

Expect a Japan-first stack to emerge: foundation models tuned for Japanese language, local regulations, and domestic industry data. That can mean lower latency via regional data centers and clearer compliance for sensitive workloads.

Vendors should anticipate procurement cycles that favor domestic AI, chips, and cloud capacity. For engineering orgs, plan for integration with new APIs and toolchains as this ecosystem comes online.
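One low-cost way to prepare is to isolate model access behind a thin interface now, so a domestic model can be swapped in later without touching application code. Below is a minimal Python sketch assuming an OpenAI-compatible chat-completions endpoint; the base URL and `jp-foundation-v1` model name are placeholders, since no domestic API has been announced.

```python
# Thin, provider-agnostic chat client. Endpoint and model names are
# placeholders -- no domestic API exists yet; swap in real values later.
import os
import json
import urllib.request
from dataclasses import dataclass
from typing import Protocol


class ChatModel(Protocol):
    def complete(self, prompt: str) -> str: ...


@dataclass
class HttpChatModel:
    """Calls any OpenAI-compatible /chat/completions endpoint."""
    base_url: str
    model: str
    api_key: str

    def complete(self, prompt: str) -> str:
        payload = json.dumps({
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8")
        req = urllib.request.Request(
            f"{self.base_url}/chat/completions",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]


def load_model() -> ChatModel:
    # Provider choice lives in config, not code, so switching is a redeploy.
    return HttpChatModel(
        base_url=os.environ.get("MODEL_BASE_URL", "https://api.example.jp/v1"),
        model=os.environ.get("MODEL_NAME", "jp-foundation-v1"),  # placeholder
        api_key=os.environ["MODEL_API_KEY"],
    )
```

Keeping the interface this narrow also makes it straightforward to A/B a domestic model against your current provider once API access opens up.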

Timeline and structure

  • Five-year government support starting FY2026.
  • Newco formed by ~10 firms; SoftBank among the anchors.
  • Initial technical core: ~100 engineers from SoftBank and Preferred Networks.
  • Mandate: develop the country's largest foundation model and the compute to run it.

Practical takeaways for builders

  • Data readiness: curate Japanese-language and industry-specific datasets with clear governance for future fine-tuning (a minimal manifest sketch follows this list).
  • Infrastructure: prepare for model hosting on Japan-based clouds or colo; revisit latency, data residency, and energy budgets.
  • Security: map AI supply-chain risks (chips, frameworks, model weights) and align with likely government guidance.
  • Robotics: evaluate pipelines that combine perception, control, and LLM planning if you build for warehouses, logistics, or manufacturing.
  • Hiring and skills: upskill teams on foundation-model ops, evaluation, and safety benchmarks to shorten adoption time when the stack ships.
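
On the data-readiness point, "clear governance" largely means recording provenance, licensing, PII status, and residency per dataset before a fine-tuning run needs it. A minimal sketch using a JSONL manifest; the field names are illustrative, not a standard.

```python
# Sketch of a dataset governance manifest for future fine-tuning.
# Field names are illustrative; adapt them to your own compliance needs.
import json
from dataclasses import dataclass, asdict
from pathlib import Path


@dataclass
class DatasetRecord:
    name: str
    language: str          # e.g. "ja"
    domain: str            # e.g. "manufacturing", "finance"
    source: str            # where the raw data came from
    license: str           # usage terms, checked before training
    contains_pii: bool     # drives masking/review steps
    storage_region: str    # data residency, e.g. "jp-east"
    last_reviewed: str     # ISO date of the latest governance review


def append_record(manifest_path: Path, record: DatasetRecord) -> None:
    """Append one dataset entry to a JSONL manifest."""
    with manifest_path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record), ensure_ascii=False) + "\n")


if __name__ == "__main__":
    append_record(
        Path("datasets.manifest.jsonl"),
        DatasetRecord(
            name="support-tickets-ja-2024",
            language="ja",
            domain="customer-support",
            source="internal CRM export",
            license="internal-use-only",
            contains_pii=True,
            storage_region="jp-east",
            last_reviewed="2025-12-01",
        ),
    )
```

A manifest like this is cheap to maintain now and pays off later, when procurement or regulators ask which data fed which model.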

What to watch next

  • Cabinet approval of the basic AI plan and funding release schedule.
  • Semiconductor supply deals and domestic data center incentives.
  • Early model previews, API access, and participation programs for enterprises and universities.

Level up your team

If you're planning pilots or want to pressure-test your AI roadmap before Japan's models land, explore hands-on programs and certifications focused on foundation models and MLOps. A good starting point is the latest curated training here: Complete AI Training - Latest AI Courses.

