GSA Sets 2026 AI Agenda as War Department Rolls Out GenAI.mil
The US General Services Administration (GSA) has set a clear target for 2026: help agencies move faster with AI, remove blockers, and promote collaboration across government. In parallel, the Department of War has launched GenAI.mil, a secure generative AI tool now available on its non-classified network.
What GSA Is Doing Right Now
GSA has started testing chatbots that answer common questions about federal programs. It's also using AI to draft market research summaries and is piloting built-in generative features to boost employee productivity.
Fixing the Bottlenecks: FedRAMP 20x
Slow and costly authorizations have held back AI adoption. In 2025, changes to the government-wide FedRAMP program led to "FedRAMP 20x," a new assessment and authorization path.
According to GSA, FedRAMP 20x is expected to remove the requirement for agency sponsorship of AI cloud offerings before they enter the authorization pipeline. This could streamline AI procurement across agencies and widen access to critical services. For the public, that should mean easier access to benefits, faster answers, and quicker support during disasters.
Test Before You Buy: USAi and the AI Action Plan
The administration's AI Action Plan (July 2025) set a goal for agencies to test AI models before purchasing, at no cost and without security risk. The USAi platform, launched in August 2025, is the first shared federal environment for AI experimentation. GSA says it will speed adoption, build smarter infrastructure, and coordinate federal action.
War Department Launches GenAI.mil
The Department of War (formerly the Department of Defense) introduced GenAI.mil, a secure generative AI platform available to military personnel, civilians, and contractors on the department's non-classified network. Secretary of War Pete Hegseth called it "a new era" in which every member of the workforce can be more efficient and impactful.
The tool runs on a specialized version of Google's Gemini for Government and is configured to handle controlled unclassified information. It includes safeguards that help prevent leaks, prompts users to verify outputs, and shows a green banner that reminds users what can and can't be shared. "AI should be in your battle rhythm every single day; it should be your teammate," Hegseth said, urging immediate adoption across daily workflows.
What This Means for Federal Leaders
- Accelerate procurement: Use FedRAMP 20x to bring AI services into the pipeline faster. Update templates with AI-specific evaluation criteria, data-use terms, and human review requirements.
- Start narrow pilots: Prioritize clear use cases such as FAQ chatbots, market research summaries, document formatting, and research assistance, then expand based on performance and risk.
- Set guardrails: Define how controlled unclassified information is handled. Require human-in-the-loop review and clear labeling for AI-generated content.
- Train your teams: Run short, role-based training on prompting, verification, and safe use. Publish an acceptable-use policy and track adoption with usage analytics.
- Build verification into the workflow: Require users to double-check outputs, route sensitive tasks through approvals, and collect feedback to improve prompts and guidance (a minimal sketch follows this list).
- Coordinate early with security and privacy: Log usage, regularly review prompts and outputs, and align with your CISO and privacy office on oversight.
- Share lessons learned: Use interagency forums and the USAi platform to exchange playbooks, metrics, and reusable prompts.
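The verification, labeling, and logging guidance above can be wired into even a simple internal tool. The sketch below is a minimal illustration in Python, not a prescribed implementation: generate_draft() is a stand-in for whatever approved generative AI service an agency actually uses, and the marker list, record fields, and log format are hypothetical.

```python
import json
import logging
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative human-in-the-loop gate for AI-generated drafts.
# Every draft is screened, logged, and attributed to a human reviewer
# before it is treated as approved.

logging.basicConfig(filename="ai_usage.log", level=logging.INFO)

BLOCKED_MARKERS = ("SSN:", "CUI//", "PII:")  # illustrative screening terms only


@dataclass
class DraftRecord:
    task: str
    prompt: str
    output: str
    reviewed_by: str = ""
    approved: bool = False
    timestamp: str = ""


def screen_prompt(prompt: str) -> None:
    """Reject prompts that appear to contain content that must stay out of the tool."""
    for marker in BLOCKED_MARKERS:
        if marker.lower() in prompt.lower():
            raise ValueError(f"Prompt blocked: contains restricted marker '{marker}'")


def generate_draft(prompt: str) -> str:
    """Placeholder for a call to an approved generative AI service."""
    return f"[AI-GENERATED DRAFT based on: {prompt[:60]}]"


def request_draft(task: str, prompt: str) -> DraftRecord:
    """Screen the prompt, call the model, and log the exchange for later review."""
    screen_prompt(prompt)
    record = DraftRecord(
        task=task,
        prompt=prompt,
        output=generate_draft(prompt),
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    logging.info(json.dumps(asdict(record)))
    return record


def approve(record: DraftRecord, reviewer: str) -> DraftRecord:
    """Record the human reviewer who verified the output before it is used."""
    record.reviewed_by = reviewer
    record.approved = True
    logging.info(json.dumps(asdict(record)))
    return record


if __name__ == "__main__":
    draft = request_draft("FAQ response", "Summarize eligibility rules for program X.")
    approve(draft, reviewer="jane.analyst")
    print("Approved:", draft.approved, "| Reviewer:", draft.reviewed_by)
```

The point of the pattern is less the code than the record it leaves: every AI draft carries its prompt, its output, and the name of the person who verified it before use.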
Practical Next Steps
- Identify two to three high-volume tasks for AI pilots in Q1-Q2 and set success metrics (accuracy, time saved, user satisfaction); a simple rollup sketch follows this list.
- Map data flows for each pilot and document what content stays out of prompts.
- Prepare a short adoption guide: approved tools, do/don't list, verification checklist, and escalation paths.
- Align contracting with FedRAMP 20x expectations and pre-negotiate data-use and audit clauses with vendors.
- Report early wins and risks to leadership to secure funding for scale-up.
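For the success metrics in the first step above, a pilot team can roll up reviewer feedback with something this small. This is a minimal sketch that assumes per-task reviewer records; the PilotResult fields and sample values are illustrative assumptions, not an official reporting format.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative rollup of pilot results into the three headline metrics
# named in the first step: accuracy, time saved, user satisfaction.


@dataclass
class PilotResult:
    task: str
    output_accurate: bool   # did the reviewer accept the AI draft as correct?
    minutes_saved: float    # reviewer-estimated time saved vs. manual drafting
    satisfaction: int       # user rating, 1 (poor) to 5 (excellent)


def summarize(results: list[PilotResult]) -> dict[str, float]:
    """Aggregate per-task results into pilot-level metrics for leadership reporting."""
    return {
        "accuracy_rate": mean(1.0 if r.output_accurate else 0.0 for r in results),
        "avg_minutes_saved": mean(r.minutes_saved for r in results),
        "avg_satisfaction": mean(r.satisfaction for r in results),
    }


if __name__ == "__main__":
    sample = [
        PilotResult("FAQ response", True, 12.0, 4),
        PilotResult("Market research summary", True, 35.0, 5),
        PilotResult("Document formatting", False, 0.0, 2),
    ]
    print(summarize(sample))
```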
Why This Matters
GSA's moves cut red tape and give agencies a safer way to test and buy AI. The War Department's rollout shows how fast a large organization can put a secure tool on every desk when guidance is clear. The path forward is focused pilots, strong guardrails, and steady training, so teams can deliver better services without slowing down oversight.