
AB 331: A lesson for future regulation of automated decision tools


Artificial intelligence is reshaping the workforce, largely by shifting decision-making processes into the hands of automated decision tools. This transformation has understandably drawn calls for robust regulation of AI, but to date there is little to no government oversight of the development and deployment of automated decision tools.

But not for a lack of effort.

This year, numerous bills have emerged aimed at regulating AI in varying capacities. Bills still circulating through the Legislature include SB 721, by Sen. Josh Becker, D-San Mateo, which would create the California Interagency AI Working Group, and AB 302, by Assemblymember Christopher Ward, D-San Diego, which would require the Department of Technology to conduct inventories of high-risk automated decision systems used by state agencies.

In addition, SCR 17, authored by Sen. Bill Dodd, D-Napa, would demonstrate California’s commitment to President Joe Biden’s “Blueprint for an AI Bill of Rights.” This resolution is the first in the nation to affirm commitment to the principles outlined in Biden’s AI Blueprint.

Sen. Dodd also introduced SB 313, a proposal that failed to gain traction. That measure would have created an Office of Artificial Intelligence within the Department of Technology and required state agencies to inform users when communication is conducted through generative AI.


Another notable bill that did not make it through this year is AB 331, authored by Assemblymember Rebecca Bauer-Kahan, D-Orinda. Unlike other proposals, this bill focused on the impact of deploying automated decision tools. Automated decision tools can absorb biases from the data they process, resulting in “algorithmic discrimination,” or unjustified differential treatment. A prominent example comes from healthcare, where risk-prediction algorithms have been found to rely on flawed metrics that lead to race-based discrimination.

AB 331 aimed to prohibit algorithmic discrimination, with violations enforceable through a private right of action, or the right of a private citizen to initiate a lawsuit.

In addition, AB 331 would have required developers and deployers of automated decision tools to conduct impact assessments of tools in use, or face substantial administrative fines. The bill would also have required deployers to notify subjects when an automated decision tool is being used to make consequential decisions and, if feasible, offer an alternative selection process.

AB 331 survived two committee votes before ultimately being held on the Assembly Appropriations Committee Suspense File, killing it for this year.

“I bring the gender lens to it,” said Assemblymember Bauer-Kahan when discussing her personal connection to the bill. “As a woman who’s worked in a male-dominated field, how do we create an opportunity for this to be better than the past? And I actually believe it’s possible. I’m not trying to end innovation – I’m trying to make innovation better for California.”

While AB 331 addressed pertinent issues, it met strong opposition from many business and tech groups that signed on to a California Chamber of Commerce-led letter voicing concerns about AB 331’s private right of action, administrative penalties, broad scope, and the potential for confusion with other circulating bills. The letter also criticized the bill’s limited right to cure, or limited time for businesses to remedy biases in their automated decision tools to ensure compliance.

The Civil Justice Association of California, an organization that addresses legislation affecting California businesses, signed the opposition letter to AB 331. “If it’s really a problem of discriminatory application use, let’s fix that,” said Jaime Huff, Vice President and Counsel, Public Policy, for CJAC. “But when you start to monetize it and people can now make a living from it, all bets are off and it becomes a problem and another line item in the budget that gets passed on to consumers.”

Huff expressed concerns that the scope of the right to cure in AB 331 would not provide businesses with adequate time to address biases in their automated decision tools, and that the private right of action would lead to unnecessary consumer lawsuits.

“It’s frustrating because you love the purpose of what they’re trying to do,” said Huff. “It’s just the way they want to enforce it. If they want to enforce it through administrative civil enforcement like the attorney general, that’s fine – it’s not for profit.”

While enforcing a prohibition on algorithmic discrimination through administrative civil penalties may appease business advocacy organizations, consumer protection advocates, such as Hayley Tsukayama, Senior Legislative Analyst at the Electronic Frontier Foundation, hold a different perspective.

“In a lot of cases, private right of action gets legislated out of bills because of industry opposition,” said Tsukayama. “They don’t necessarily want individuals to be empowered to pursue their rights in the case that they get violated. To me, if you don’t have strong enforcement in a bill, it’s just a lot of pretty words.”

Previous legislation focused on the regulation of automated decision tools has failed to gain traction in the Legislature due to similar industry opposition. Former Assemblymember Ed Chau, D-Los Angeles, introduced the Automated Decision Systems Accountability Act of 2021 (AB 13), which would have required the Department of Technology to conduct an inventory of high-risk automated decision systems used by state agencies, and would have required deployers to perform impact assessments of those systems.

Opponents of both bills argued that the impact assessments were too broad and arbitrary, and that the requirements would slow the procurement of automated decision systems. Ultimately, AB 13 was gutted and amended to address an unrelated issue.

The need for this regulation has only become more pressing since Assemblymember Chau’s bill failed, drawing national and international attention. Assemblymember Bauer-Kahan noted that much of AB 331 was modeled on the White House Blueprint, which articulates that Californians must not face discrimination from automated decision tools, so the issues the bill addressed are likely to resurface next year.


The overall goal of this bill is not controversial, as it addresses basic civil rights issues, according to Taneicia Herring, Government Relations Specialist for the California Hawaii State Conference of the NAACP, a supporting organization of AB 331.

“It has everything to do with civil rights,” said Herring. “We don’t want anybody being denied housing or employment because of their race, so we want to have those systems in place to make sure that cannot happen.”

There is no doubt among groups supporting and opposing AB 331, as well as Assemblymember Bauer-Kahan, that this bill will reemerge next year. What it will take for legislation that mitigates algorithmic discrimination to pass is another question, but some lessons can be drawn from how things unfolded this year.

A safe bet is that the Assemblymember will reconsider regulating through a private right of action, expand the right to cure to give businesses a grace period to correct biases in their automated decision tools, and narrow the overall scope of the bill.

Regardless of the outcome, the failure of this bill illustrates a key piece of advice for those looking to regulate AI: meaningful regulation requires collaboration between legislators, industry experts, and business interests.

“What we’re trying to achieve is ensuring that there are teeth in this, that companies do the right thing,” said Assemblymember Bauer-Kahan. “How we achieve that, I think, has been open to negotiation and continues to be.”

Molly Jacoby is a Capitol Weekly intern from UC Berkeley.
