California’s SB 53: New AI Safety Bill Aims for Transparency from Big Tech
California is once again leading the charge in AI regulation with a new bill—SB 53—designed to bring more transparency and accountability to the tech industry’s most powerful artificial intelligence systems. State Senator Scott Wiener, well-known for his efforts on tech oversight, is the driving force behind this legislation, aiming to close the transparency gap left by voluntary disclosures from major AI labs.
The Road to SB 53: Lessons from a Vetoed Bill
Last year, Wiener’s earlier proposal, SB 1047, faced fierce opposition from Silicon Valley. Critics argued that holding tech companies liable for AI-related harms would stifle innovation, and Governor Gavin Newsom ultimately vetoed the bill. The backlash was so intense that tech insiders even hosted an "SB 1047 Veto Party."
This year, however, SB 53 has found more allies. Unlike its predecessor, which focused on liability, SB 53 emphasizes transparency and self-reporting, specifically targeting companies with at least $500 million in revenue. Its key requirement: leading AI labs such as OpenAI, Anthropic, xAI, and Google would need to regularly publish safety reports on their most advanced models, outlining how they test for risks like bioweapons, cyberattacks, and catastrophic misuse.
Industry Reaction: From Opposition to Endorsement
The tech industry’s response to SB 53 has been notably more positive. Anthropic publicly endorsed the bill, and Meta also voiced support, calling it "a step in the right direction." While some companies, like OpenAI, argue that AI regulation should be handled at the federal level, there’s far less open hostility than with SB 1047. The bill’s narrower focus—excluding most startups and avoiding sweeping liability—has helped it gain traction.
What SB 53 Would Change
- Mandatory Safety Reports: AI labs above the revenue threshold must publicly detail how they assess and mitigate risks in their most capable models.
- Whistleblower Protections: New protected channels for AI lab employees to report safety concerns directly to government officials.
- State-Run Cloud Resources: Establishes CalCompute, a state-operated cloud cluster intended to broaden access to AI research infrastructure beyond the major tech companies.
Why Now? State vs. Federal Regulation
Despite the industry’s push for federal oversight, Wiener remains skeptical that meaningful national rules are coming soon. He points out that federal efforts have stalled, especially under the Trump administration, which has shifted focus from AI safety to rapid growth and infrastructure investment. "This is an industry that we should not trust to regulate itself or make voluntary commitments," Wiener says, emphasizing the need for state-level action.
The Stakes: Balancing Innovation and Safety
SB 53 is laser-focused on preventing the worst-case scenarios—human deaths, major cyberattacks, and the creation of chemical weapons with AI. While other bills in California tackle issues like algorithmic discrimination and AI companions, SB 53 zeroes in on catastrophic risks.
Wiener insists his goal is not to hinder innovation. "I want tech innovation to happen. It’s incredibly important," he explains. But, he adds, "We should try to make it harder for bad actors to cause these severe harms, and so should the people developing these models."
Looking Ahead: The Governor’s Decision
As SB 53 sits on Governor Newsom’s desk, the outcome remains uncertain. Wiener’s message to the governor is clear: "You wisely convened a working group that produced a very strong report, and we really looked to that report in crafting this bill. The governor laid out a path, and we followed that path in order to come to an agreement."
If signed, SB 53 could set a national precedent for AI safety transparency, offering a template for balancing innovation with public protection. For business owners and technologists, it’s a sign that state-level regulation is likely to remain a key factor in the evolving AI landscape.