Federal Procurement Rules Would Weaken AI Safety Guardrails, Critics Say
A U.S. government agency is rewriting federal procurement rules in ways that would require AI contractors to disable safety features and license their systems for broad government use, according to comments filed with the General Services Administration.
The GSA, which handles federal purchasing for goods and services, proposed the changes ostensibly to steer tax dollars toward "ideologically neutral" American AI innovation. But the draft rules extend far beyond that stated goal, critics argue.
Required Licensing for "All Lawful Purposes"
The most problematic provision would require contractors to license AI systems to the government for "all lawful purposes." This language is broad enough to cover surveillance and data collection that the government could justify under loose interpretations of existing law.
The government has a documented history of finding legal loopholes to conduct surveillance and has engaged in illegal spying. Requiring contractors to hand over AI systems without restrictions removes a critical check on how those tools get used.
Removing Safety Guardrails
A second provision bars contractors from refusing government requests based on their own safety policies. In plain terms: if a company's guardrails might block a request, the company must disable them.
This runs counter to widely shared concerns about AI safety. Companies build guardrails precisely to prevent misuse and harmful outputs; mandating their removal in federal contracts would undermine those protections.
Broader Policy Implications
The rules also include vague "anti-Woke" requirements that are technologically incoherent and would not serve the public interest in responsible innovation.
If adopted, these provisions would become standard terms across federal contracts, shaping how AI is procured and deployed throughout the government.
Critics filed formal comments urging the GSA to withdraw the proposal and start over, arguing the current draft prioritizes government access over privacy, safety, and responsible development.