How State AI Laws Threaten Innovation and Interstate Commerce Under the Dormant Commerce Clause

The dormant Commerce Clause limits states from imposing AI regulations that burden interstate commerce or regulate out-of-state conduct. States should focus on local harms while Congress sets national AI rules.

Published on: Sep 03, 2025

The Dormant Commerce Clause and AI Regulation

The U.S. Constitution assigns distinct roles to Congress and the states in regulating artificial intelligence (AI). Congress is tasked with overseeing the national AI market, while states are responsible for policing harmful uses of AI within their borders. The dormant Commerce Clause serves as a critical boundary, preventing states from imposing excessive burdens on interstate commerce. Crucially, Congress's inaction does not grant states carte blanche to enact sweeping AI laws that affect the entire country.

However, many current state AI bills, such as proposals in California and New York, risk crossing the line drawn by the dormant Commerce Clause. These laws often impose costs that extend far beyond state borders and outweigh any local benefits. They introduce complex, costly administrative requirements that do little to enhance safety and could even degrade AI products. They also increase compliance burdens for small developers, often referred to as Little Tech, which typically lack the resources to isolate products by state or to pass costs on to consumers.

Balancing Federal and State Roles in AI Governance

The debate over AI governance often features two extremes. Some argue states should avoid AI regulation completely, citing the national scope of AI markets and Congress’s constitutional authority over interstate commerce. Others invoke the Tenth Amendment, claiming states have not only the power but the duty to fill federal regulatory gaps with broad AI laws. Neither extreme fully captures the Constitution’s nuanced division of regulatory authority.

Congress should focus on national AI rules, and states should concentrate on addressing harmful in-state uses, such as fraud, civil rights violations, and consumer protection. This balance supports U.S. global leadership, fosters competition between startups and larger firms, and protects consumers. It also means Congress does not hold exclusive AI regulatory authority, nor can states interpret federal inaction as permission to impose restrictive local standards nationwide.

The Supreme Court’s Guidance on State Regulation

The Supreme Court recognizes that states can set local standards reflecting community preferences when regulating residents and businesses within their borders. Such diversity in regulation is a valued feature of the constitutional system. However, the Court also warns against economic fragmentation among states, a problem the Framers sought to avoid.

Recent state AI bills sometimes explicitly cite the absence of federal AI legislation to justify imposing broad rules affecting developers nationwide. In 2024 alone, states proposed over 1,000 AI-related laws, many of which would impose significant costs on out-of-state AI developers while providing unclear benefits to residents. These costs disproportionately burden smaller firms, threatening the competitive environment critical for innovation.

The Dormant Commerce Clause Explained

The Constitution empowers Congress to regulate interstate commerce, and courts have long interpreted this grant to implicitly limit state regulation. The dormant Commerce Clause restricts state laws that interfere with the free flow of goods and services across state lines—even when Congress has not legislated on the issue.

If states had unchecked authority, individual states could dictate national AI product standards, fragmenting the market. The dormant Commerce Clause ensures that state autonomy does not compromise the national interest in free commerce.

Key Principles of the Dormant Commerce Clause

  • Anti-discrimination: States cannot favor in-state interests over out-of-state competitors.
  • Anti-excessive burden: States cannot impose burdens on interstate commerce that are clearly excessive relative to local benefits.
  • Anti-extraterritoriality: States cannot regulate conduct occurring wholly outside their borders.

For AI regulation, the anti-excessive burden and anti-extraterritoriality principles are most relevant. Few AI laws explicitly discriminate against out-of-state actors, but many impose burdens that extend beyond state borders.

The Anti-Excessive Burden Principle

Originating in the 1970 case Pike v. Bruce Church, Inc., this principle directs courts to strike down state laws whose burdens on interstate commerce are clearly excessive relative to their local benefits. For example, state laws requiring particular safety equipment on vehicles in interstate transit have been invalidated when their burdens outweighed their benefits.

In National Pork Producers Council v. Ross (2023), the Supreme Court upheld a California law banning the sale of pork from pigs confined in cruel conditions, finding the alleged burden on interstate commerce insufficient to sustain a dormant Commerce Clause claim. The Court noted that producers could comply by aligning their operations with the law, segregating production, or opting out of the California market altogether.

The Anti-Extraterritoriality Principle

This principle limits states from enacting laws controlling conduct occurring entirely outside their borders, even if that conduct affects in-state commerce. Some courts treat this as part of the anti-excessive burden analysis, but others apply it as a standalone test.

For instance, the Eighth Circuit struck down a Minnesota law regulating prices for drugs eventually sold to Minnesota consumers because the law controlled transactions occurring wholly out of state.

Factors Courts Consider in Dormant Commerce Clause Cases

  • Whether the law imposes excessive costs on interstate commerce relative to in-state benefits.
  • Whether the law regulates conduct wholly outside the state.
  • Whether non-economic harms, such as reduced product quality, are involved.
  • Whether the law disrupts the flow of interstate commerce, similar to burdens on transportation.
  • Whether compliance options exist, such as exiting the state market or segregating operations.

State AI Policies and Commerce Clause Concerns

States have increasingly taken the lead in technology regulation. In 2024, states enacted 238 tech laws compared to one federal law; for AI specifically, states passed over 100 laws while Congress passed none. Many aim to address harmful AI uses within the state and raise no dormant Commerce Clause issues. For example, New York’s proposed law holding chatbots liable for impersonating licensed professionals applies clearly within state borders.

However, other laws extend well beyond their states. California's AB 1018 would require costly "performance evaluations," third-party audits, and compliance officers for AI developers, potentially costing hundreds of millions of dollars annually for local government agencies alone. The bill contains no provision restricting its application to California-based developers or in-state sales, so it would effectively set national standards.

New York’s Responsible AI Safety and Education (RAISE) Act, pending governor action, mandates safety protocols, detailed testing disclosures, and safeguards against critical harms. While it limits application to development or deployment “in whole or in part” within New York, its reach could extend to out-of-state developers whose models are deployed in New York by third parties.

Colorado’s SB 205 imposes procedural requirements related to “algorithmic discrimination,” with no provision limiting application to in-state conduct or sales. After enactment, concerns led the legislature to delay the law’s effective date and the governor to support a federal moratorium on state AI laws.

Two Main Dormant Commerce Clause Issues with State AI Laws

  • Extraterritorial costs: These laws impose burdens on interstate commerce that may outweigh local benefits.
  • Impact on Little Tech: Small developers have limited options to avoid compliance costs, unlike large firms that can absorb expenses more easily.

Why Extraterritorial Costs Matter

These laws force small AI companies to divert resources from product development to compliance. While large companies might manage requirements for safety protocols, impact assessments, and audits, startups often lack legal or policy teams. Regulatory costs add to existing barriers like data access and talent scarcity, hindering competition.

Beyond financial costs, these laws may unintentionally reduce AI safety. For example, requirements to evaluate "catastrophic" harms might divert attention from more common but still significant safety issues. Mandatory impact assessments that tally risks without weighing offsetting benefits might discourage developers from releasing valuable features.

AI regulations can also implicate free speech and competition. Mandatory disclosures that compel platform speech have faced First Amendment challenges. Heavy regulatory burdens often favor established firms, reducing innovation and increasing prices.

Protectionist motives may also influence some state AI laws, potentially shielding large local companies from out-of-state rivals. This regulatory capture is a known risk in policymaking and could hamper AI market dynamism.

Extraterritorial Reach of State AI Laws

Many state proposals do not limit their application to conduct within their borders. California’s AB 1018, for example, applies broadly to developers regardless of location. This means a developer in Washington might face liability under California’s law without any direct connection to California.

Even laws with explicit territorial limits, like New York’s RAISE Act, may regulate out-of-state conduct. The RAISE Act applies to models “deployed” in New York, which could include out-of-state developers whose models are used by third parties in New York. This also impacts open source developers, who cannot control downstream uses effectively. Attempts to restrict usage may run into consumer protection or antitrust issues.

Modern AI development often involves mixing models and techniques across states, making it difficult to isolate local versus out-of-state conduct. This complexity increases the risk that state laws will have broad extraterritorial effects.

Conclusion

The dormant Commerce Clause sets essential limits on state AI regulations, balancing local interests with national commerce. While states should address harmful AI uses within their borders, laws that impose excessive burdens or regulate out-of-state conduct risk constitutional challenges and could stifle innovation, especially among smaller developers.

Federal lawmakers should establish a clear AI regulatory framework to prevent a patchwork of conflicting state laws. Until then, states must respect constitutional guardrails and focus on targeted, proportional regulation that protects residents without burdening the national AI market.