By Zach Olsen and Amanda Allen
BLOG OVERVIEW:
California's amendments to its FEHA are set to go into effect October 1, 2025. Employers will need to be aware of a number of changes that will impact them, including broader EEO compliance obligations and specific new provisions regarding the use of automated-decision systems.
The State of California has approved sweeping new changes to the existing regulations implementing its Fair Employment and Housing Act (FEHA). These amendments, targeted at employers’ use of automated-decision systems (ADS) and selection criteria (including qualification standards, employment tests, and proxies), were made to the longstanding state law that broadly mandates non-discrimination in employment. The law applies to employers with five or more employees in total, whether those employees are located inside or outside of California, though employees outside of California are not themselves covered by the Act’s protections unless the alleged conduct occurred in California or involved decision makers located in California.
Effective October 1, 2025, these changes are aimed at regulating the increasing use of artificial intelligence (AI) tools in employment-related decisions. In reality, these amendments reinforce much broader equal employment opportunity (EEO) compliance obligations, while also offering an affirmative defense for employers who engage in proactive EEO analytics and measures. Below is a detailed breakdown of the amendments and their implications for employers, organized along two key dimensions:
- New ADS-specific provisions that reflect California’s response to the rise in use of AI in employment decision-making; and
- Broader EEO compliance themes that remain foundational to all employment decision-making in light of changes to EEO rules at the federal level.
California’s New Rules for Automated-Decision Systems
As noted in a previous blog, there are a number of changes of which employers need to be aware regarding the use of an ADS in employment decision-making.
New Definition of ADS and Related Terms
The following terms are now expressly defined under FEHA:
- Automated-Decision System (ADS) - A computational process (including one that uses artificial intelligence, machine learning, or algorithms) that makes or assists in making decisions about employment.
- Algorithm - A set of rules or instructions a computer follows to perform calculations or other problem-solving operations.
- Artificial Intelligence - A machine-based system that infers, from the input it receives, how to generate outputs.
- Machine Learning - A system’s ability to use and learn from its own analysis of data or experience and apply this learning automatically to future calculations or tasks.
- Automated-Decision System Data - Any data used in or resulting from the application of an ADS and/or used to develop or customize an ADS.
These definitions cover tools and methods that aid employment decision-making, such as:
- Resume Screeners
- Online Assessments and Games
- Video Interview Analysis
- Personality or Behavioral Profiling
- Targeted Job Advertising
- Third Party Data Analysis
- Automated Ranking or Scoring
Other Changes Related to Automated Tools
Explicit Non-Discrimination Mandates for ADS - The amendments extend the non-discrimination provisions of FEHA to explicitly cover automated-decision systems.
Reasonable Accommodation - Employers must provide reasonable accommodations to applicants and/or employees if an ADS evaluates factors that could impact individuals based on disability, religion, or any other protected traits (e.g., facial recognition or reaction-time tests).
Agent and Vendor Liability - California officially defines “agent” under FEHA and expands the definition of “employer” to include agents and third parties who act on behalf of employers. This includes those who deploy or develop automated-decision systems.
FEHA’s Broader EEO Compliance Framework
While the 2025 amendments focus on automated-decision systems, they are nested within FEHA’s long-standing regulatory framework on employment. This framework continues to impose robust EEO obligations on employers regardless of whether they use an ADS. These requirements include:
Mandatory EEO Data Collection - FEHA mandates the collection of race, sex, and national origin data for applicants and employees.
Recordkeeping and Readiness - Employers must maintain detailed records of selection criteria, ADS data, and applicant flow logs and retain them for at least four years. These records must be available for inspection and preserved in the event of a complaint.
- The regulations explicitly state that it is lawful for employers to collect and maintain applicant flow logs and other statistical demographic data.
- In addition to the mandated collection, the regulations give employers many incentives to collect and analyze applicant flow data. For example, the regulations provide a low bar for plaintiffs to establish adverse impact. Section 11017.1(f)(3) on the consideration of criminal history states “An adverse impact may be established through the use of statistics or by offering any other evidence that establishes an adverse impact. State or national-level statistics on conviction records that show a substantial disparity based on any characteristic protected by the Act are presumptively sufficient to establish an adverse impact [emphasis added]. This presumption may be rebutted by a showing that there is a reason to expect a markedly different result after accounting for any particularized circumstances such as the geographic area encompassed by the applicant or employee pool, the particular types of convictions being considered, or the particular job at issue.”
- Demographic data must be kept separate from personnel files; one illustrative way to structure this separation is sketched below.
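Neither the regulations nor UGESP prescribes a particular format for applicant flow logs or demographic records. As a minimal, hypothetical sketch of one way to keep voluntarily self-identified demographic data separate from personnel files and from the flow log itself while still supporting later analysis, the Python example below keys both datasets to an applicant ID; the class names, fields, and values are illustrative assumptions, not anything specified by FEHA.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical applicant flow log entry: records each applicant's progress
# through a selection step (e.g., a resume-screening ADS) with no demographic
# fields embedded in it.
@dataclass
class FlowLogEntry:
    applicant_id: str    # shared key used to join datasets at analysis time
    requisition_id: str  # job opening applied to
    selection_step: str  # e.g., "resume_screen", "online_assessment"
    disposition: str     # e.g., "advanced", "rejected"
    decision_date: str   # ISO date of the decision

# Hypothetical voluntary self-identification record, stored apart from
# personnel files and the flow log.
@dataclass
class SelfIdRecord:
    applicant_id: str
    race_ethnicity: Optional[str] = None  # applicants may decline to state
    sex: Optional[str] = None
    national_origin: Optional[str] = None

flow_log = [
    FlowLogEntry("A-001", "REQ-10", "resume_screen", "advanced", "2025-10-02"),
    FlowLogEntry("A-002", "REQ-10", "resume_screen", "rejected", "2025-10-02"),
]
self_id = {
    "A-001": SelfIdRecord("A-001", race_ethnicity="Hispanic or Latino", sex="Female"),
    "A-002": SelfIdRecord("A-002", race_ethnicity="White", sex="Male"),
}

# Join the two datasets only when running an adverse impact analysis.
joined = [(entry, self_id.get(entry.applicant_id)) for entry in flow_log]
```

Keeping the self-identification records in their own table, joined only at analysis time, is one way to approach the separation requirement while preserving the ability to run the adverse impact analyses discussed below.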
Codification of Uniform Guidelines - The regulations formally adopt the Uniform Guidelines on Employee Selection Procedures (UGESP). Under UGESP, employers must be able to prove that an employment policy or practice resulting in adverse impact is job related and consistent with business necessity, and/or that there is no less discriminatory standard, test, or other selection criterion that serves the employer’s goals as effectively as the challenged one.
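Neither FEHA nor UGESP fixes a single formula for measuring adverse impact, but a common first screen compares selection rates across groups: UGESP’s “four-fifths rule” generally treats a group selection rate below 80% of the highest group’s rate as evidence of adverse impact. The short Python sketch below applies that screen to hypothetical counts; the group labels and numbers are made up for illustration, and in practice employers typically pair this screen with tests of statistical significance.

```python
# Illustrative adverse impact (impact ratio) screen on hypothetical counts.
selection_data = {
    "Group A": {"applicants": 200, "selected": 90},
    "Group B": {"applicants": 150, "selected": 45},
}

# Selection rate = selected / applicants, computed per group.
selection_rates = {
    group: counts["selected"] / counts["applicants"]
    for group, counts in selection_data.items()
}

# Impact ratio = each group's selection rate relative to the highest rate.
highest_rate = max(selection_rates.values())
impact_ratios = {group: rate / highest_rate for group, rate in selection_rates.items()}

for group, ratio in impact_ratios.items():
    flag = "potential adverse impact" if ratio < 0.80 else "no flag under 4/5ths screen"
    print(f"{group}: selection rate {selection_rates[group]:.1%}, "
          f"impact ratio {ratio:.2f} ({flag})")
```

A flag under this screen does not by itself establish unlawful discrimination; under the framework above, it shifts the focus to whether the standard, test, or ADS is job related and consistent with business necessity, and whether a less discriminatory alternative exists.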
Affirmative Defense for Anti-Bias Testing: Employers may defend against discrimination claims by demonstrating proactive anti-bias testing and mitigation efforts. The amendments indicate that the quality, recency, and scope of such efforts, the results of such testing or other efforts, and the employer’s response to those results are relevant to an employer’s defense of discrimination claims.
Lack of Evidence of Anti-Bias Testing: Conversely, the amended regulations state that “the lack of evidence of anti-bias testing or other similar proactive efforts to avoid unlawful discrimination” will also be relevant to the adjudication of a discrimination claim. This strongly incentivizes employers to conduct quality anti-bias testing/EEO analytics and make any adjustments necessary to mitigate identified risks.
Recognition of “Proxy” Discrimination: The amendments introduce the concept of “proxy” characteristics—traits that are closely correlated with protected classes and may result in indirect discrimination. For example, the regulations state that an ADS that analyzes an applicant’s tone of voice, facial expressions, or other physical characteristics/behavior may discriminate against individuals based on race, national origin, gender, or other characteristics absent evidence of job relatedness.
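The regulations do not prescribe a method for identifying proxies, but one illustrative screen is to check whether an ADS input is strongly correlated with membership in a protected group among applicants. The Python sketch below uses hypothetical data and an arbitrary correlation threshold; it is a rough screen under those assumptions, not a regulatory standard, and the feature and cutoff are invented for illustration.

```python
import statistics  # statistics.correlation requires Python 3.10+

# Hypothetical per-applicant data: a numeric ADS input (e.g., a commute-distance
# score) alongside a 0/1 indicator of membership in a protected group.
feature_values = [12.0, 35.0, 8.0, 40.0, 15.0, 38.0, 10.0, 33.0]
group_indicator = [0, 1, 0, 1, 0, 1, 0, 1]

# Pearson correlation between the feature and group membership (point-biserial).
corr = statistics.correlation(feature_values, group_indicator)

# Arbitrary illustrative threshold: a strong correlation suggests the feature
# may be acting as a proxy for the protected characteristic and warrants review.
if abs(corr) >= 0.5:
    print(f"Feature correlates with protected group (r = {corr:.2f}); review for proxy risk.")
else:
    print(f"No strong correlation detected (r = {corr:.2f}).")
```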
Annual Reporting Requirements: Employers with 100 or more employees must file a California Employer Information Report (CEIR). Currently, this requirement may be satisfied by submitting the employer’s most recent EEO-1 report.
Takeaways for Employers
Audit Use of All Employment Decision-Making Tools, including Automated-Decision Systems: Identify and document all tools used in the employment decision-making process (e.g., recruitment, hiring, promotion, termination, and compensation decisions). DCI’s third-party review services and AI Audit Framework can assist in identifying areas of risk and provide recommendations for mitigating those risks.
Conduct Proactive “Anti-Bias Testing”: Ensure tools used in employment decision-making are job related and aligned with business necessity when adverse impact is identified. This includes proactive work to conduct adverse impact analyses, formally validate ADS and/or selection criteria, address any identified risk areas, and prepare documentation describing these efforts. Unlike NYC Local Law 144, these amendments impose no publishing requirement, giving employers the opportunity to work with legal counsel to conduct this work under attorney-client privilege.
Review Vendor Contracts: Ensure third-party providers and agents acknowledge employer obligations under FEHA and other applicable laws.
Ensure Proper Data Collection Protocols: These regulations require employers to collect race, sex, and national origin information and retain this information for a period of four years. Ensure demographic data is collected and stored in compliance with state and federal law.
Train HR and Legal Teams: Educate internal stakeholders on both the ADS and EEO implications.
California’s ADS amendments represent another significant development in emerging AI governance, bringing automated-decision systems under the state’s long-standing non-discrimination rules. As more states follow suit, employers must take a proactive, multi-jurisdictional approach to compliance.
DCI is continuously tracking all state-level legislation, regulations, and rule changes. We will continue to digest these developments and provide further analysis in the coming weeks.