By: Bre Timko and Dave Schmidt
California’s Assembly Bill (AB) 2930 was introduced by the California State Assembly Privacy and Consumer Protection Committee in February 2024 and, if signed into law, would impose certain requirements on entities doing business in the state that use automated decision tools (ADTs) to make, or play a substantial role in making, consequential decisions. The bill defines a “consequential decision” as one that has a significant effect on an individual’s life relating to 1) employment, 2) education and vocational training, 3) housing, 4) essential utilities, 5) family planning, 6) adoption or reproductive services, 7) healthcare or health insurance, 8) financial services, 9) certain aspects of the criminal justice system, 10) legal services, 11) private arbitration, 12) mediation, or 13) voting. AB 2930 replaced AB 331, which also focused on increasing transparency related to the creation and use of ADTs.
Requirements
If passed, AB 2930 would go into effect on January 1, 2026, but it would also impose obligations for any covered tools used before January 1, 2025, and recurring obligations annually thereafter. The law would apply both to developers (those who produce or substantially modify an ADT) and to deployers (those who use ADTs).
Below, we break down the requirements of AB 2930 into three components: 1. the impact assessment, 2. the notices, and 3. the publishing requirements.
1. The Impact Assessment (22756.1)
The impact assessment – or “bias audit” as it is often referred to in other AI laws – would need to be performed by developers and deployers on any ADT before the tool is first deployed, and annually thereafter. An impact assessment would also need to be performed by the enactment date for any ADT used prior to January 1, 2025.
The impact assessment would need to include all of the following in order to comply with the law:
- A statement of the purpose of the ADT and its intended benefits, uses, and deployment contexts.
- A description of the characteristics the ADT will assess, the method used to measure the characteristics, how the characteristics are relevant to the consequential decision made by the tool, the outputs of the ADT, and how the outputs are used to make, or be a substantial factor in making, a consequential decision.
- A summary of the categories of information collected by the ADT, including personal information, sensitive personal information, and information related to the receipt of sensitive services (all of which are defined in California’s Civil Code).
- A statement of the extent to which the deployer’s use of the ADT is consistent with the statement the developer is required to provide to the deployer.
- An analysis of adverse impact on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, or any other classification protected by state or federal law (one common way such an analysis is calculated is sketched after this list).
- A description of the safeguards implemented to address any reasonably foreseeable risks of discrimination.
- A description of how the ADT will be used to make, or be a substantial factor in making, a consequential decision.
- A description of how the ADT has been or will be evaluated for validity, reliability, and relevance.
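AB 2930 does not prescribe a specific statistical method for the adverse impact analysis. As one illustration only, the sketch below applies the familiar “four-fifths rule” comparison of selection rates across groups; the group labels, counts, and the 0.80 threshold are hypothetical assumptions for demonstration, not requirements of the bill.

```python
# A minimal sketch, assuming a "four-fifths rule" approach to adverse impact:
# compare each group's selection rate to the highest group's rate and flag
# impact ratios below 0.80. The groups and counts below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Proportion of applicants in a group who received the favorable outcome."""
    return selected / applicants

# Hypothetical ADT outcomes for two applicant groups.
groups = {
    "Group A": {"selected": 48, "applicants": 100},
    "Group B": {"selected": 30, "applicants": 100},
}

rates = {name: selection_rate(g["selected"], g["applicants"]) for name, g in groups.items()}
highest_rate = max(rates.values())

for name, rate in rates.items():
    impact_ratio = rate / highest_rate
    status = "review for adverse impact" if impact_ratio < 0.80 else "within the four-fifths guideline"
    print(f"{name}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} ({status})")
```

In practice, a full adverse impact analysis typically also considers statistical and practical significance rather than relying on the impact ratio alone.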
Upon identification of any discrimination risk, the deployer would be prohibited from using the tool, and developers would be prohibited from making the tool available to potential deployers, until the risk has been mitigated.
The law would relieve deployers of the obligation to conduct an impact assessment if the developer has already conducted one and provided documentation, and if the deployer neither substantially modifies the tool nor uses it in a way not intended by the developer.
2. The Notices (22756.2; 22756.7)
In addition to conducting an impact assessment, AB 2930 would require deployers to notify any individual subject to a consequential decision made by an ADT that an ADT is being used. The deployer’s notice would need to contain information about the purpose of the tool, contact information for the deployer, and a description of the characteristics the ADT evaluates, the outputs it produces, and its measurement method. Developers would also need to provide notice to deployers, including a description of the data used by the ADT, its limitations, and its validity and explainability evaluations. Finally, state government deployers would be required to notify the California Privacy Protection Agency of any ADTs deployed before January 1, 2025.
Furthermore, the law would require a deployer to provide an alternative, non-ADT selection process or accommodation if technically feasible.
3. The Publishing Requirements (22756.2; 22756.5; 22756.8)
The law would require not only that an impact assessment be conducted, but also that a summary of the impact assessment be provided to individuals subject to decisions made by the ADT. The California Privacy Protection Agency could also require that a developer or deployer provide, within 30 days, any impact assessment performed in relation to the law.
Additionally, both developers and deployers would be required to establish governance programs to manage reasonably foreseeable risks of discrimination from the ADT and to designate at least one employee responsible for maintaining the governance program. They would also need to make a clear policy that summarizes the types of ADTs in use and how those discrimination risks are managed “publicly available, in a readily accessible manner.”
Penalties
A developer or deployer who violates the provisions of AB 2930 would be liable for a fine of up to $10,000 per violation, with each subsequent day considered a separate violation. Violations involving algorithmic discrimination would carry a fine of $25,000 per violation. Courts could also award injunctive relief, declaratory relief, and attorney’s fees. Developers and deployers would be given 45 days’ written notice to cure the alleged violation before any action for injunctive relief could be brought.
The law does not provide for a private right of action but dictates that civil actions may be brought by the state Attorney General, a district, county, or city attorney in the jurisdiction where the violation occurred, a city prosecutor with the consent of the district attorney, or the California Civil Rights Department.
Conclusion
California AB 2930 adds to the array of state and local laws being proposed and enacted in the absence of a federal AI law. Colorado Senate Bill 24-205, passed in May 2024, has received considerable attention as the first comprehensive state legislation governing the development and use of ADTs, and California may soon follow.
AB 2930 currently sits in the California Legislature. We hope to have a clearer picture of the future of this law by the end of the 2024 legislative session, which concludes at the end of this month.
Stay tuned to DCI Blogs for frequent updates, and be sure to check out the resources available in DCI’s Artificial Intelligence Topic Toolkit for guidance on state, local, and federal actions related to AI.