Podcast

Senator Scott Wiener: California and AI

Senator Scott Wiener delivers the Keynote at the California and AI conference on July 8, 2025. Photo by Joha Harrison, Capitol Weekly

CAPITOL WEEKLY PODCAST: On July 8, Capitol Weekly and the University of California Student and Policy Center presented California and AI, a daylong look at the Golden State’s approach to regulating Artificial Intelligence. California is a global leader in AI technology and lawmakers in Sacramento are tasked with creating legislation and regulations that will help the state maintain leadership in this emerging industry, while creating guardrails that protect Californians. Legislators have introduced over 30 AI-related bills this session, and nearly 20 AI bills were signed into law by Governor Newsom in 2024. Senator Scott Wiener led the charge both this year and last, and has emerged as one of the key voices in the discussion around AI regulation, and we present his Keynote as part of today’s podcast. His remarks begin at about the 16 minute mark.

A full transcript of his remarks is below.

1:48 Mecha-Hitler

3:15 Op-Eds

12:25 Remembering George Steffes

15:58 Senator Scott Wiener

42:23 #WWCA

Senator Wiener’s remarks were recorded live at California and AI, which was held in Sacramento on Tuesday, July 8, 2025

Introduced by Rich Ehisen of Capitol Weekly

This transcript has been edited for clarity.

 RICH EHISEN: You’re probably all familiar with Senator Scott Wiener, he has been one of the most active and engaged lawmakers when it comes to AI regulation in this state. Some of you, I’m sure, are aware of that, and if not, you’re about to become aware of it. So please, without any further ado from me, Senator Scott Wiener.

SENATOR SCOTT WIENER: Great. Good afternoon, and thank you for having me today, and for being willing to engage around AI policy, which is increasingly part of our legislative work here in the Capitol and in other states as well. You know, the innovation that’s happening around AI is just… it’s extraordinary. And of course, AI has been around for a long time, but the acceleration in the last few years in particular has been really extraordinary, and I think things are moving more quickly than people had anticipated.

I’m proud that in the district I represent in San Francisco, we are the beating heart of AI innovation globally. So I am very immersed, not just as a policymaker, but as a resident of San Francisco, in so much of the innovation that’s happening.

And we’re seeing AI models facilitate and expedite the development of drugs to treat serious health conditions, to improve climate disaster planning, to boost agricultural productivity, to make transportation more efficient, and so on and so forth. AI is already helping us to address significant challenges that we face as a community and as a society. And really, the sky is the limit in terms of the benefits that we can derive from artificial intelligence.

At the same time, as with any powerful technology, in addition to the benefits, there are risks. OpenAI, for example, reports that its most recent model has a “medium risk” of supporting the creation of chemical, biological, radiological or nuclear weapons. And Anthropic recently reported that its model was showing signs of deceptive behavior, including trying to blackmail one of the developers if the developer tried to shut the model down. So we have the potential to transform life for the better, and we have risks.

And so the question for us is how do we really emphasize the benefits and try to get ahead of the risks and try to reduce those risks? We have a long history of ignoring risks when it comes to technology. We’ve seen that with social media. We’ve seen it in various contexts, and then getting around to dealing with it when it’s pretty much too late and the horse is out of the barn.

We also know that AI, if not well managed, could, in addition to safety risks, lead to massive economic displacement. Because whereas technology has always impacted employment, the speed of the implementation of this technology is so much faster than anything we’ve ever seen.

“California is really well suited to lead on AI policy. We are the heartland of so much tech innovation in general, and particularly AI innovation, and we can get it done.”

When the printing press was created, I don’t know how long it took to start impacting the way people worked. I would imagine it took a very long time. Even the personal computer. Now we’re seeing change in a matter of months or years. And the question is, can we get ahead of that?

But even with all these high stakes, this technology that promises so many benefits but creates safety risks and significant employment shifts and dislocation… even with all that, the United States Congress still can’t get it together to enact even the most minimal, basic regulations. There’s still no federal deepfake law. There’s no federal law against creating deepfake porn starring a 14-year-old girl. This is real. It’s happening.


And Congress has done literally nothing. But it’s not surprising. Congress still has not… there’s no federal data privacy law in 2025. Literally none. There’s no federal law around social media. There’s no federal net neutrality law. And so it’s not surprising that Congress has done really nothing of substance in terms of trying to get ahead of the risks that AI presents. And so that’s why it was even more horrifying than it otherwise would have been when Ted Cruz, who was a year ahead of me in law school, everyone hated him then, too. [laughter]

The fact that he proposed this ten-year moratorium on state AI regulation, which actually passed the House of Representatives… I mean, there were a lot of despicable things in that bill, but this was certainly one of them… Literally, to say we’re going to ban states from regulating AI, whether around kiddie porn or nuclear weapons or anything else. Oh, but we’re not going to regulate it ourselves, right?

Because normally when Congress preempts, it’s like we’re occupying the field: we’re going to pass comprehensive legislation, and then we don’t want to have the states undermining that or creating their own patchwork. They weren’t doing that. It was just, we’re going to ban the states from acting. It was one of the most irresponsible things I’ve seen in a long time.

And fortunately, I want to really give a shout out to the advocates and academics and some people in industry… I thought it was pretty gross that some people in industry were pushing hard for that federal moratorium without also pushing for federal regulation. But they did. But there was a big coalition against it, and it was voted down 99 to 1.

So that was a huge win. So now we have to do our job in California. And California is really well suited to lead on AI policy. We are the heartland of so much tech innovation in general, and particularly AI innovation, and we can get it done. This is a state that passed the strongest data privacy law in the country. We were able to pass a net neutrality law, which I had the honor of authoring back in 2018. And we can do it here as well. And we have passed some good laws addressing risks around AI, around deepfakes and some other issues as well. And we need to keep moving in that direction.

So last year, for those who follow these things, you may recall that I authored Senate Bill 1047 to require the largest AI labs developing the largest AI models to basically engage in a safety evaluation of their models before releasing them, and to be able to shut them down. It was really not much more than what all of the large labs had already committed to publicly, over and over again. They all committed that they were going to perform these safety evaluations. But once we put it into a bill, there was a lot of opposition.

And that, you know, you have to always wonder about, because self-regulation has its limits. And so with that bill, I’m really grateful to my colleagues, we put it on the governor’s desk, and he vetoed it.

But the governor didn’t just veto it. He made clear that he thought that we did need to do something, that he just didn’t support what was before him. And so he convened a working group of some of the top minds in AI, including Professor Fei-Fei Li, who was one of the most vocal opponents of SB 1047.

“In addition to innovating here and creating great new technologies we should also be producing here as well in manufacturing, in that advanced manufacturing space.”

That working group put out its report about a month ago, and we are now working on incorporating aspects of that report into a bill that I am authoring this year, SB 53, which currently does two things.

It provides whistleblower protections to employees at AI labs if they need to disclose that there is some sort of significant risk that they are seeing. And it also creates something called Cal Compute, a public cloud to try to democratize access to compute, which is incredibly expensive for startups, academic researchers, etc.

And so we are looking at including aspects of the governor’s working group report in that bill, particularly around transparency of safety protocols by the labs. And I’m excited about this bill for a couple of reasons. First of all, California should be leading in terms of innovation around AI safety. That is an area where California can lead. We should be leading on Cal Compute. As we see industrial policy come back to the US, where we have bipartisan support for industrial policy, the government should be partnering with our universities and with the private sector to really provide access. And that’s what Cal Compute will do.

And this is all, of course, also in the wake of the permitting reform legislation that we just passed as part of the budget, beginning to reform CEQA, the California Environmental Quality Act, which has often been used to slow down or stop projects and progress for reasons having nothing to do with the environment. And so we made some reforms there. And one of the things we did in that legislation was to exempt advanced manufacturing from CEQA.

So in addition to innovating here and creating great new technologies we should also be producing here as well in manufacturing, in that advanced manufacturing space.

When the CHIPS Act was passed, it more or less skipped over California in terms of the manufacturing piece of it. There were some small pieces of the CHIPS Act that made it to California, but not the actual manufacturing piece, because it’s too hard to actually set up shop here. And so by exempting advanced manufacturing from CEQA, I think we can help to make sure that California is not just creating the technology, but actually building it as well.


So, in so many ways, California has led on privacy, on net neutrality, on various other technological advances, and we should be leading here as well. And leading is not just about leading on the innovation. It’s also leading on responsible, smart public policy. For the folks who are concerned about a patchwork of state laws around AI… and I’ll be the first one to say, I think it would be preferable for the federal government to do it, but as I mentioned earlier, if you think that’s going to happen, I have a bridge to sell you. And so when industry says don’t do it at the state level, do it at the federal level, you have to ask yourself what’s going on here, because it’s not happening at the federal level.

And until that happens, states do have a responsibility. And I think California can create the template and lead the way. I agree we should try to avoid a patchwork of state laws. And so having a law in California, I hope, can really send a signal to other states. Not that we want to be bossy and tell them what to do… but we do sometimes get it right in California.

So I do just want to read from the governor’s working group report: “Evidence that foundation models contribute to both chemical, biological, radiological and nuclear weapon risks for novices and loss of control concerns have grown even since the release of the draft of this report in March 2025. Frontier AI companies… their own reporting reveals concerning capability jumps across threat categories.”

So now is the time for us to act. And we’re working very hard to do that. So stay tuned. This week and next week we’ll have some good news. And again, thank you for having me today and I look forward to continuing to have California lead. Thank you.

RICH EHISEN: All right. Well we have about ten minutes or so where we can take some questions. I’m sure somebody has something they would like to ask Senator Wiener. So, yeah, there goes Tim.

MOLLY DUGAN: Hello. Thank you so much for being here, Senator. I have a question. During the previous panel, they talked a lot about AI literacy. Are there any plans underway for new programs in K-12, community college, CSU, or UC to infuse AI literacy into public education?

SW: Yeah, I know that’s definitely a focus. And it’s incredibly important. It’s literacy in terms of education, but also, I think, literacy for the public at large: being able to know what’s real and what’s not real, and knowing how to navigate AI in sort of a responsible way. So I do think that’s important.

I think we have struggled in California a bit in terms of STEM education generally. And there’s some great work happening in local communities. But we definitely have to do more.

IRVIS OROZCO: Hello. My name is Irvis Orozco, Sacramento Valley Informador. Right now, as we’re seeing, the federal government is linking up with AI to control or have access to information about Californians that should be private. Specifically, right now, because of Border Patrol and everything that’s going on, they are using AI in order to locate people. How is California going to keep the information of immigrants private? And is there something that the legislature and the governor are doing to ensure all that information is protected, specifically now that a lot of immigrants are being targeted?

SW: Yeah, I mean, that’s very, very real. And we saw it starting out with DOGE, in terms of infiltration of all of these databases and sharing of IRS and Social Security data. You know, Elon [Musk] and Trump have had a parting of the ways, but we have to always remember the extreme harm that Elon Musk and everyone on his team have done in terms of the mining of data, making a lot of people less safe, particularly immigrants, and basically sending a message that if you pay taxes and participate in the system, your data is going to be transferred over to the secret police that used to be known as ICE.

“This is why we can’t have nice things in California. Because there’s always a concern that something bad might happen. And therefore we have to make it hard for anything to happen. That’s how we’ve done it in California.”

And so in California, we have been tightening our laws in terms of protecting data privacy. There are obviously limits to what we can do when you have the federal government involved. But both around immigrants and around trans people and access to health care, we’ve been trying to really tighten data privacy protections, because it’s incredibly important.

And I think we need to, you know, call it what it is. They’re creating a police state. Not just around immigrants, but first and foremost immigrants, but really around everyone. They just put in a $45 billion slush fund for their secret police to build detention centers and other needs. And so what’s happening in LA is going to be spreading. And it’s a police state. It’s a surveillance state. It’s so much worse than I think people imagined. And AI will supercharge it.

And we also need to be looking at who are the companies who are contracting with this new federal law enforcement apparatus. You know I think we know what Palantir is doing. But there are other companies as well. And there needs to be accountability for companies that are in Sacramento lobbying the legislature while they are helping to build this horrific police state that is tearing families apart and just doing deep harm to our communities.

RE: Senator, I’ve got a question here. You mentioned CEQA reform. I know some of the people who were adamantly opposed to that… your bill and how it got incorporated into the budget… particularly over the manufacturing. You mentioned large manufacturing, and I think there’s a lot of fear that that is going to allow facilities to be built, particularly in communities of lower economic means, and that they will not have any ability to stop that from happening. Maybe you could talk in a little more detail about what the CEQA reform is going to mean in that regard.

SW: This is why we can’t have nice things in California. [laughter]  Because there’s always a concern that something bad might happen. And therefore we have to make it hard for anything to happen. That’s how we’ve done it in California.

We legislate for the worst case scenario. Right? Because something bad can be built… and it’s always true: you can never reduce the risk to zero of some bad development happening. But because something bad might happen, we have to make it impossible to do anything good.

That is why we have such a deep housing crisis. It is why we don’t have enough public transportation. It is why Texas and Florida both produce way more clean energy than California, even though we are a bigger state and they don’t believe in climate change. So they are smaller states that don’t believe in climate change, and they are both producing more clean energy than California… actually, Texas is way more, and Florida just leapfrogged us. Because in California, we believe that if something bad can theoretically happen, then we shouldn’t be allowed to build anything.

And that’s really what’s driven California’s policies around permitting and CEQA, etc., for the last 50 years. And it is driving the state into a ditch. And we’re trying to turn that around. So in terms of advanced manufacturing… if you watched the process around the CEQA reform that we did in the budget, the level of shrillness was really extraordinary. And people have a right to have an opinion and oppose anything they want. But some of the statements that we heard made around a very, I think, common-sense CEQA reform were over the top.

On advanced manufacturing: the exemption applies on land that is zoned industrial. You can’t put it in the middle of a residential neighborhood, can’t put it in the middle of, like, a wetlands somewhere. It has to be land that’s zoned industrial. And the bill doesn’t change local control… cities still have control over whether they allow manufacturing in their city, where they allow it, whether they have conditions on permits. So it doesn’t take away any of the local permitting, or any requirement to get a permit from Cal EPA or the Air Resources Board.


All we’re saying is, let’s let it go through the normal government process at the state and local level, without anyone who has enough money to hire a lawyer being able to tie you up for five years under CEQA. That’s the problem that CEQA has created: if you’re organized and have money to hire a lawyer, you can tie any project up in knots for years and years. Whether that is a food bank in the city of Alameda, a school, a child care center in Napa, a bike lane plan in San Francisco, clean energy, five hundred apartments right by a BART station in downtown San Francisco that they wanted to build on a parking lot… all got tied up in CEQA. That’s government at its absolute worst.

CATHERINE BAKER: Senator, what’s next on the CEQA frontier? Are there any additional changes that might be in round two down the road? Obviously, it’s one of the trickier things to ask a week or two after a bill of major changes happened. Is there more to do in that area? And if not, what in either housing or advanced manufacturing do you think might still be on the list of things to do that we can talk about at a future conference?

SW: Yeah. I mean, the two budget bills that we passed, AB 130 and SB 131, were quite significant in terms of reforms to the statute itself and then creating a number of exemptions, including around all infill housing. So it was significant, and it took a lot. It took, you know, the governor throwing down and saying, let’s do this in the budget, and the leadership in both houses agreeing to that. And I’m really grateful to the Pro Tem and to the Speaker and to the Governor and to Assemblymember Wicks, who was my partner in the endeavor, for moving in this direction.

You know, there are always CEQA bills in the legislature. I have another bill moving forward this year to make permanent an exemption on sustainable transportation projects, for example. And so there are always ideas, but this was a pretty big step forward.

Great. I think I can do one more.

RE: We have time for one more question. Oh, hand right over here.

ZIMA CREASON: Hello, I’m Zima Creason. I’m the executive director of the California EDGE Coalition and I also serve on the San Juan Unified Board of Education. And so my question really is related to are there things local policy folks such as myself can be doing to help you advance our shared goals related to AI, data privacy… All the things that we spoke about earlier today? Could we be doing more? Should we be asking different questions? Are there local policy efforts that we should be advancing again to advance our shared goals?

SW: Yeah. No, I think it’s great when local governments get involved and advocate in a proactive way for policies. Sometimes in the legislature, there’s a perception that local governments only get involved when they’re opposing things. And that’s certainly true, and everyone has a right to oppose things. But getting involved in a proactive way is great as well.

And, you know, when we have local elected officials, city council members, school board members who testify as lead witnesses in favor of a bill, it’s just a very powerful voice. So, for example, if there were an AI bill where you have a school board member saying, this is how this is impacting our students right now, and that’s why this bill is important… that’s just one example. Great. Well, thank you so much, everyone. Appreciate it.

RE: All right. Thank you so much, Senator.

Thanks to our sponsors:

THE TRIBAL ALLIANCE OF SOVEREIGN INDIAN NATIONS, WESTERN STATES PETROLEUM ASSOCIATION, KP PUBLIC AFFAIRS, PERRY COMMUNICATIONS GROUP, CAPITOL ADVOCACY, THE WEIDEMAN GROUP, CALKIN PUBLIC AFFAIRS, STUTZMAN PUBLIC AFFAIRS, LUCAS PUBLIC AFFAIRS and CALIFORNIA PROFESSIONAL FIREFIGHTERS
