
Legislation seeks to address algorithmic bias in state IT projects
This week, legislation that would create the Automated Decision Systems Accountability Act of 2021 passed out of the Assembly Committee on Privacy and Consumer Protection, moving a step closer to regulating artificial intelligence in state IT projects.
Assembly Bill 13, authored by Assemblymember Ed Chau (D-Monterey Park), requires that decision-making on the acquisition of information technology be supplemented with a computational process, including data analysis, statistical models, and artificial intelligence, that minimizes the risk of algorithmic bias, or the “risk of adverse and discriminatory impacts resulting from the design and application of automated decision systems.”
Existing law requires the “value-effective acquisition” of information technology goods, meaning the careful consideration of operational cost, product quality, and other factors when awarding contracts. AB13 focuses on applications that carry a high level of risk.
According to the bill, a high-risk application is any application that poses risks to the personal information or legal rights of individuals, particularly the rights identified in the Unruh Civil Rights Act.
AB13 hinges on the idea that the legal obligation to seek a value-effective acquisition cannot be fulfilled without eliminating or reducing the risk of discrimination and bias in the automated decision systems being acquired.
The proposal says the California Department of Technology must “establish and make public guidelines for identifying automated decision systems that are subject to the bill’s requirements” on or before Jan. 1, 2023.
The bill also requires that, within ten days of awarding a contract, state agencies must submit a high-risk automated decision system accountability report to CDT, which would provide “a description of any potential disparate impacts . . . and a detailed mitigation plan for identifying and minimizing the potential for any disparate impacts throughout the contracted use of the system.”
Assemblymember Ed Chau, who first proposed AB13 in December 2020, fielded questions during Thursday’s hearing.
Some of AB13’s opponents expressed concerns about the breadth of the bill and about whether the definition of a “high-risk application” could apply to nearly any work the state does on a daily basis.
Chau assured opponents that the bill had already been narrowed substantially and that AB13 “will likely be amended to explicitly exclude [calculators and spreadsheets] from the bill.”
He also assured the opposition that AB13 would not require private companies that develop algorithms and artificial intelligence to reveal trade secrets or proprietary information.