By: Dave Schmidt, Amanda Allen, and Bre Timko
Recently, New York City’s law governing the use of automated employment decision tools (hiring and promotion decisions that involve algorithmically driven mechanisms) has garnered much attention. Other state and local jurisdictions (e.g., California, New Jersey, Washington, DC) have also been considering rules for selection procedures that use artificial intelligence. However, activity has also increased at the federal level, from both the Equal Employment Opportunity Commission (EEOC) and the Office of Federal Contract Compliance Programs (OFCCP). Both agencies have spent considerable time weighing how to address the use of these emerging technologies in selection practices, so the uptick should come as no surprise, and momentum will likely continue to build at both agencies over the next several years.
There are three recent activities on the federal front to note:
- November 2022 – OFCCP publishes a proposed compliance review scheduling letter that includes requirements regarding the use of artificial intelligence (AI)
- January 2023 – EEOC schedules a hearing on AI for January 31
- January 2023 – EEOC publishes its proposed Strategic Enforcement Plan for 2023-2027, which includes the use of AI as a priority
Proposed Changes to the OFCCP Scheduling Letter
In November 2022, OFCCP proposed changes to its compliance review scheduling letter (for a synopsis of all the proposed changes, please see this prior DCI blog). Of particular interest to the artificial intelligence community, and to employers using selection procedures enabled by these emerging technologies, is Item 19 under Support Data of the Itemized Listing in the Compliance Review Scheduling Letter, which would require contractors to provide the agency with:
“Documentation of a contractor’s policies and practices regarding all employment recruiting, screening and hiring mechanisms, including the use of artificial intelligence, algorithms, automated systems, or other technology-based selection procedures.” (emphasis added)
The specific call-out of these emerging technologies is notable: it suggests that OFCCP may be paying extra attention to this area and may ask organizations for more detail to help the agency understand how these selection procedures are being used.
EEOC Hearing on Artificial Intelligence Scheduled for January 31
EEOC announced that it will hold a virtual hearing, Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier, from 10:00 a.m. to 4:00 p.m. EST on January 31, 2023. The focus will be “to examine the use of automated systems, including artificial intelligence (AI), in employment decisions.”
EEOC has convened 11 panelists who will discuss artificial intelligence (as well as other automated mechanisms) and its civil rights implications for applicants and employees. The panelists represent multiple disciplines (e.g., legal, data science, civil advocacy, and industrial and organizational psychology). With the increased use of artificial intelligence-based methods in selection practices, there are concerns that these tools may “encode harmful biases and result in unlawful discrimination.” The hearing will also delve into ways these technologies might support diversity, equity, inclusion, and accessibility.
Register here if you want to attend this hearing.
EEOC Proposed Strategic Enforcement Plan for 2023-2027
EEOC published a draft Strategic Enforcement Plan (SEP) for fiscal years 2023-2027 in the Federal Register on January 10, 2023. The SEP outlines EEOC’s priorities and coordinates its work across multiple fiscal years. The draft includes references to artificial intelligence tools, with a focus on those used by employers in hiring that may introduce discriminatory decision-making.
Some notable “Subject Matter Priorities” called out in the plan include recruitment and hiring policies and practices involving these emerging technologies that could result in discrimination against protected groups, such as:
- “the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups;”
- “screening tools or requirements that disproportionately impact workers based on their protected status, including those facilitated by artificial intelligence or other automated systems, pre-employment tests, and background checks.”
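The “adversely impact” and “disproportionately impact” language in these priorities is commonly operationalized, as a first screen, with the four-fifths rule from the Uniform Guidelines on Employee Selection Procedures: a group’s selection rate below 80% of the highest group’s rate may indicate adverse impact. The sketch below illustrates that calculation with invented group names and counts; a real analysis would use actual applicant data and appropriate statistical tests, and this is not a method prescribed by the EEOC or OFCCP documents discussed here.

```python
# Illustrative four-fifths rule screen for adverse impact.
# Group names and counts below are hypothetical, for demonstration only.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who were selected."""
    return selected / applicants

def impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Ratio of a group's selection rate to the highest group's rate."""
    return group_rate / reference_rate

# Hypothetical counts: (selected, applicants) per group.
groups = {"Group A": (48, 100), "Group B": (30, 100)}

rates = {g: selection_rate(s, n) for g, (s, n) in groups.items()}
reference = max(rates.values())  # highest selection rate across groups

for g, r in rates.items():
    ratio = impact_ratio(r, reference)
    flag = "potential adverse impact" if ratio < 0.8 else "passes 4/5 screen"
    print(f"{g}: rate={r:.2f}, ratio={ratio:.2f} -> {flag}")
```

In this invented example, Group B’s rate (0.30) is 0.625 of Group A’s (0.48), below the 0.8 threshold, so it would be flagged for closer review. The four-fifths rule is a rough screen, not a legal conclusion; significance testing and practical context matter.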
Further, in “Addressing Selected Emerging and Developing Issues,” EEOC notes that it:
“… will focus on employment decisions, practices, or policies in which covered entities' use of technology contributes to discrimination based on a protected characteristic. These may include, for example, the use of software that incorporates algorithmic decision-making or machine learning, including artificial intelligence; use of automated recruitment, selection, or production and performance management tools; or other existing or emerging technological tools used in employment decisions.”
The public has until February 9, 2023, to submit comments through the Federal Register. This development is not unexpected, as it builds on work published in 2022 by EEOC, DOJ, and NIST, but it further demonstrates that federal agencies are taking notice of the potential for technology-enabled employment discrimination when artificial intelligence tools are used for recruiting, hiring, and termination.
It is increasingly clear that regulatory focus on algorithmically driven selection procedures will not dissipate anytime soon. While you should pay attention to activity at the state and local level, you will also want to keep an eye on developments within the federal agencies. DCI will provide periodic updates to help you stay connected to these happenings.