U.S. Government Buys Americans' Data From Brokers, Bypassing Privacy Protections
The federal government is purchasing bulk data about American citizens from commercial data brokers, circumventing constitutional protections and federal privacy laws designed to restrict government surveillance. The Department of Homeland Security received $165 billion in yearly funding through the 2025 tax-and-spending law, with Immigration and Customs Enforcement allocated approximately $86 billion, much of it directed toward AI-powered government surveillance capabilities.
This approach allows officials to access sensitive personal information (location histories, communications, financial records, health data) without obtaining a warrant or meeting the legal standards required for direct government collection. The FBI confirmed in March 2026 that it purchases Americans' location data from brokers to track citizens.
How the Data Pipeline Works
Your daily activities generate a continuous stream of data. Doorbell cameras record your movements. Your car logs your location, speed, conversations, and biometric markers like facial expressions and heart rate. Your smartphone tracks your location, health information, app usage, and communications. Retail stores scan your face and record your purchases.
Data brokers aggregate this information and sell it on a largely unregulated commercial market. The government then purchases these datasets in bulk, bypassing the Fourth Amendment's prohibition on unreasonable searches and the Electronic Communications Privacy Act's restrictions on intercepting communications.
AI systems analyze massive quantities of this data, identifying patterns and predicting behavior. DHS has funded AI platforms that acquire all 911 call center data to build geospatial heat maps for predicting crime, a form of predictive policing.
The Consent Loophole
When you buy a device, download an app, or create an account, you agree to lengthy terms of service that authorize data collection and sale. This "consent" is often the only legal justification companies cite for sharing your information with brokers.
Opting out rarely stops data collection. Tinder plans to scan users' entire camera rolls with AI. Apple Pay and Google Pay track your purchases. Wearable devices monitor heart rate, blood oxygen, stress levels, and neurological markers; this data is not protected under HIPAA because tech companies are not classified as healthcare providers.
Government Surveillance Expands With Few Restrictions
The Trump administration's national AI policy framework, released March 20, 2026, encourages Congress to fund "wider deployment of AI tools across American industry" and to allow companies and academia to use federal datasets to train AI systems. These datasets contain lifelong sensitive details: biographical information, employment history, tax records.
The administration also issued executive orders to accelerate federal AI adoption and remove state-level AI regulation barriers. Simultaneously, it mandated that the federal government not procure AI models designed to detect or correct for bias, raising risks given reports of AI systems exposing sensitive data during routine operations.
The Pentagon has classified contractor Anthropic as a national security risk because the company insisted its AI model Claude not be used for mass domestic surveillance or autonomous weapons.
Legal Protections Fall Short
The Fourth Amendment requires police to obtain warrants before searching phones or tracking location data. The Wiretap Act prohibits unauthorized interception of communications. Yet courts have allowed these protections to erode by accepting company claims of user consent.
Congress has repeatedly failed to pass comprehensive data privacy legislation or restore the original intent of the Wiretap Act. The line between lawful foreign intelligence work and unlawful domestic spying has become dangerously blurred.
Restoring these laws to their intended purpose of protecting Americans' privacy in communications, and passing legislation to secure data privacy and protect against AI harms, would require Congressional action that has not materialized.