California Isn't Regulating AI. It's Buying It Differently.
The White House spent the last four months warning states not to regulate AI. California just did it anyway without passing a single new law.
On March 30, California Governor Newsom signed Executive Order N-5-26, directing the Department of General Services and the California Department of Technology to develop mandatory vendor certifications for any company selling AI products to the state. Within 120 days, vendors bidding on state contracts must demonstrate compliance with safeguards covering illegal content, bias prevention, and civil rights protections. The state CISO can independently override federal supply chain risk assessments.
This is not legislation. It is procurement. And that distinction matters enormously.
The Loophole
A federal executive order issued in December 2025 declared that excessive state AI regulation threatens American competitiveness. The DOJ stood up a task force specifically to litigate state AI laws. Utah's AI transparency bill was derailed after the White House issued a warning letter. Multiple states stalled or scaled back AI legislation amid federal pressure.
But the federal order contained a carve-out: state government procurement was explicitly exempted from preemption. California read that exemption and drove a semi-trailer truck through it.
Procurement moves faster than legislation, and my peers in the law enforcement community who deal with it at the administrative level know this all too well. It does not require a legislative session, committee hearings, or a gubernatorial signature on a bill. It requires an executive order and an existing bureaucracy. California has both. Combined with SB 53, which took effect in January, the state now operates a two-track system: legislation for broad AI safety principles, procurement for enforceable compliance requirements.
Why Market Power Is the Point
California is home to 32 of the top 50 AI companies globally. The Bay Area alone accounts for 51% of U.S. AI startup funding. The state holds roughly a quarter of all AI patents and conference papers. Its GDP makes it the fourth-largest economy on the planet.
When California tells AI vendors they need certifications to sell to the state, it is not making a request. It is setting a floor. No major AI company is likely to build a California-only product variant alongside a separate product for everyone else. The compliance requirements will become the default. This is the Brussels Effect, except the capital is Sacramento and the lever is procurement rather than regulation.
What Law Enforcement Should Watch
The executive order’s certification requirements include three areas that directly intersect with policing.
Civil rights and civil liberties protections. Any AI vendor selling surveillance tools, facial recognition systems, or predictive analytics to California state agencies will need to demonstrate civil rights safeguards. Those same vendors sell to local agencies nationwide. The certifications will follow the products.
Discriminatory output prevention. Bias testing requirements will apply to tools used in hiring, benefits distribution, and resource allocation, functions that sit in every police department's administrative infrastructure.
Illegal content safeguards. Deepfake detection, non-consensual imagery screening, and watermarking mandates for AI-generated images and video. For agencies dealing with digital evidence, these standards will shape the tools available on the market.
The Procurement Precedent
The federal government has historically set AI governance standards that trickle down to state and local agencies through grant requirements and best-practice frameworks. California just demonstrated that the flow can run the other direction.
If your agency purchases AI tools from any vendor that also sells to California, and statistically, it does, then N-5-26 is already part of your procurement environment. You did not opt in. The market decided for you.
The interesting question is not whether other states will follow California's procurement approach. It is whether they need to, or whether California's market power alone is enough to make these certifications a de facto national standard.