Pentagon Calls for AI Task Platform to Cut Back-Office Bottlenecks, Scale to 150,000+ Users

The Pentagon seeks an AI task platform to unify back-office work for 150K+ users, with automation, ease of use, and ICAM/ATO security. Vendors must prove they can scale and submit solution briefs by March 9.

Categorized in: AI News Management
Published on: Feb 18, 2026

Pentagon seeks AI task management platform built for 150K+ daily users

The Pentagon has opened a call for solutions for a Joint Enterprise Task Management System (JETMS) to streamline back-office tasking at scale. The aim: reduce manual work, speed decisions, and give leaders real-time visibility into workload and status across the enterprise.

Tasking today is scattered across legacy tools, email threads, and manual trackers. That fragmentation drains time and muddies accountability. JETMS is meant to bring order to the chaos with an AI-enhanced, enterprise-grade platform.

What the Pentagon wants

The department is seeking a commercially available AI-enhanced platform to support the Office of the Secretary of War's Correspondence and Task Management System (CATMS) and the Army's Enterprise Task Management Software Solution (ETMS2). Officials described a mature, scalable, and secure capability that reduces administrative burden and accelerates decision-making across the enterprise. The notice uses the secondary names authorized by the Trump administration when referring to the Department of Defense and the Office of the Secretary of Defense.

Core AI and ML features should automate the task lifecycle end to end: intelligent intake and classification, drafting, routing, and response generation based on a repository of authorized documents. The experience must be intuitive and require minimal training.
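
The solicitation does not prescribe an implementation, but the lifecycle it describes is easy to picture. Below is a minimal sketch of the intake-and-routing step, assuming a hypothetical task record and a toy repository of authorized reference documents; every office name, keyword, and function here is illustrative, and a production system would use retrieval plus model-based classification with human review.

```python
# Minimal sketch of automated task intake: classify an incoming task and
# suggest a routing office by matching it against authorized reference
# documents. All names, categories, and documents here are hypothetical.
from dataclasses import dataclass

@dataclass
class ReferenceDoc:
    title: str
    office: str         # office that owns tasks matching this document
    keywords: set[str]  # terms drawn from the authorized document

REPOSITORY = [
    ReferenceDoc("Travel policy memo", "Logistics",
                 {"travel", "voucher", "per", "diem"}),
    ReferenceDoc("Cybersecurity directive", "CIO",
                 {"patch", "vulnerability", "network", "ato"}),
    ReferenceDoc("Congressional correspondence guide", "Legislative Affairs",
                 {"congressional", "inquiry", "member", "letter"}),
]

def classify_and_route(task_text: str) -> tuple[str, str]:
    """Return (suggested_office, matched_document) by simple keyword overlap."""
    tokens = set(task_text.lower().split())
    best = max(REPOSITORY, key=lambda doc: len(tokens & doc.keywords))
    if not tokens & best.keywords:
        return ("Unassigned", "No match")  # fall back to human triage
    return (best.office, best.title)

if __name__ == "__main__":
    office, source = classify_and_route(
        "Respond to congressional inquiry letter on travel vouchers")
    print(f"Route to: {office} (based on: {source})")
```

The useful question for evaluators is not how the matching is done but where the human-in-the-loop checkpoints sit around steps like this.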

Scale and architecture expectations

A central objective is a multi-tenant cloud architecture that can scale to more than 150,000 active daily users. Vendors should be ready to prove performance at that level, not just claim it.

Expect scrutiny on latency, concurrency, and how well the system handles spikes without degrading user experience. Clear, measurable SLAs will matter.
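
Vendor benchmarks deserve independent spot checks. The sketch below shows the shape of a simple latency probe, assuming a placeholder endpoint; a credible 150K-user test requires distributed load generation against production-like data, so treat this as a starting point for verifying reported p95/p99 numbers, not a substitute.

```python
# Minimal concurrency/latency probe: hit an endpoint from many threads and
# report p50/p95/p99 latency. The URL and request volume are placeholders.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINT = "https://example.invalid/api/tasks"  # hypothetical endpoint
REQUESTS = 500
CONCURRENCY = 50

def timed_request(_: int) -> float:
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=10):
            pass
    except Exception:
        pass  # a real harness would count and report failures separately
    return time.perf_counter() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_request, range(REQUESTS)))
    print(f"p50: {statistics.median(latencies):.3f}s")
    print(f"p95: {latencies[int(0.95 * len(latencies))]:.3f}s")
    print(f"p99: {latencies[int(0.99 * len(latencies))]:.3f}s")
```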

Security, ATO, and ICAM integration

Speed and security must coexist. Solutions should integrate cleanly with existing identity, credential, and access management (ICAM) systems and be able to obtain an authority to operate (ATO) from the Pentagon quickly.

Data rights and avoiding vendor lock

The department intends to retain unlimited rights to its data and workflow configurations. That's a signal for managers: insist on exportable data models, open APIs, clear IP boundaries, and contract language that preserves flexibility to switch vendors.

Ask how your team can retrieve prompts, policies, workflows, and audit logs in standard formats without penalties.
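
A quick way to pressure-test portability claims during evaluation is to ask for sample exports in open formats. The sketch below shows what such an export target can look like, using hypothetical field names and records; nothing here comes from the solicitation itself.

```python
# Minimal sketch of a portability check: write workflow and audit records to
# open formats (CSV and JSON) that any successor system can ingest.
import csv
import json

audit_log = [
    {"task_id": "T-1001", "action": "created", "actor": "jdoe",
     "timestamp": "2026-02-18T09:00:00Z"},
    {"task_id": "T-1001", "action": "routed", "actor": "system",
     "timestamp": "2026-02-18T09:01:12Z"},
]

workflows = [
    {"name": "Congressional inquiry",
     "steps": ["intake", "draft", "legal review", "release"]},
]

with open("audit_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=audit_log[0].keys())
    writer.writeheader()
    writer.writerows(audit_log)

with open("workflows.json", "w") as f:
    json.dump(workflows, f, indent=2)
```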

What this means for management

This is an operations problem first, and a technology problem second. The winning platforms won't just check AI boxes; they'll reduce cycle times, improve response quality, and create audit-ready trails by default.

Your evaluation should focus on measurable outcomes tied to workload, throughput, and decision latency, not feature counts.

Evaluation checklist for leaders

  • Adoption: Time-to-value, required training, and user success rates within the first 30 days.
  • Workflow fit: Ability to mirror current tasking patterns, with low-friction customization.
  • AI quality: Accuracy of classification, drafting, and routing against authorized documents; human-in-the-loop controls.
  • Governance: Role-based access, audit logging, retention controls, red-teaming and bias monitoring.
  • Integration: Out-of-the-box connectors for email, document repositories, ticketing, and analytics.
  • Performance: Proven benchmarks for 150K+ daily users, including load testing evidence.
  • Security and ATO: Control mappings, documentation completeness, and prior ATO experience.
  • Data portability: Contractual clarity on data and workflow ownership; export formats and exit terms.
  • Cost model: Transparent pricing for seats, storage, API calls, and AI inference.

Implementation approach

Pilot with a high-volume directorate that represents common task patterns. Set clear baselines: task intake volume, average time-to-assign, time-to-close, and rework rates.
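
Those baselines are straightforward to compute once task records can be exported. A minimal sketch, assuming hypothetical field names in the exported records:

```python
# Compute pilot baselines from exported task records: average time-to-assign,
# time-to-close, and rework rate. Records and field names are hypothetical.
from datetime import datetime

tasks = [
    {"created": "2026-02-02T09:00", "assigned": "2026-02-02T15:00",
     "closed": "2026-02-05T11:00", "reopened": False},
    {"created": "2026-02-03T08:30", "assigned": "2026-02-04T10:00",
     "closed": "2026-02-09T16:00", "reopened": True},
]

def hours_between(start: str, end: str) -> float:
    fmt = "%Y-%m-%dT%H:%M"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600

time_to_assign = [hours_between(t["created"], t["assigned"]) for t in tasks]
time_to_close = [hours_between(t["created"], t["closed"]) for t in tasks]
rework_rate = sum(t["reopened"] for t in tasks) / len(tasks)

print(f"Avg time-to-assign: {sum(time_to_assign) / len(time_to_assign):.1f} h")
print(f"Avg time-to-close:  {sum(time_to_close) / len(time_to_close):.1f} h")
print(f"Rework rate:        {rework_rate:.0%}")
```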

Roll out in waves. Use opt-in champions, short training modules, and dashboards that surface cycle time gains weekly. Keep a tight feedback loop between end users, policy owners, and the vendor.

Key date

Vendors must submit solution briefs by March 9. Managers should align internal stakeholders now (operations, IT, security, and legal) to accelerate evaluation and contracting decisions.

Further reading

For practical frameworks on evaluating and leading AI initiatives, see AI for Management.

