DCI Consulting Blog

AI in Employment: A Deep Dive Into OFCCP's Newly Released Resources

By: Sarah Layman and Dave Schmidt | May 8, 2024

Last week, DCI shared that the Office of Federal Contract Compliance Programs (OFCCP) had published new guidance for federal contractors on the use of artificial intelligence (AI) in employment decision-making. In it, OFCCP answered commonly asked questions about the legal duties of federal contractors regarding the use of AI in an employment context and shared "promising practices" for mitigating potential biases that can be introduced or perpetuated through the use of AI systems.

As federal contractors adopt AI systems to enhance the productivity and efficiency of their hiring processes, understanding these FAQs and promising practices is essential for ensuring fair and equitable employment decisions and mitigating risk. Below, DCI summarizes both sections of the new guidance.

Common Questions

Questions 1, 2, and 3 provide definitions for key terms: Artificial Intelligence, Algorithm, and Automated Systems. This is particularly useful given that AI is an extremely broad academic discipline with many different and emerging definitions and sub-fields. These definitions may also prove helpful for assessing whether a decision-making tool is covered by any future regulations that may be released.

In Question 4, OFCCP emphasizes that federal contractors must comply with anti-discrimination laws, regardless of whether their systems leverage AI. OFCCP will continue to use compliance evaluations to ensure contractors meet long-standing EEO obligations, regardless of whether they use more traditional or AI-based systems. This includes, but is not limited to, record keeping obligations (e.g., keeping records of all internal and external resume searches) and providing reasonable accommodations to applicants.

Question 5 describes risks that can be introduced through the use of AI in employment decision-making: namely, that AI may inadvertently embed bias and discrimination, potentially replicating or exacerbating existing inequalities in the workplace and violating workers' civil rights. OFCCP provides an example in which a resume scanner that automatically rejects candidates with resume gaps may unlawfully impact women or individuals with disabilities.

Questions 6 and 7 address whether and how OFCCP investigates the use of AI systems. OFCCP "investigates the use of AI during compliance evaluations and complaint investigations to determine whether a federal contractor is in compliance with its nondiscrimination obligations." When AI-based selection procedures result in adverse impact in employment decisions based on race, sex, or ethnicity, contractors must validate the system in accordance with applicable laws and the Uniform Guidelines on Employee Selection Procedures (UGESP), including analyzing job relevance, assessing bias, and exploring alternative selection procedures. This is no different from the requirements that have applied to all selection procedures for well over forty years.
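For illustration only (this computation is not spelled out in OFCCP's new guidance), adverse impact is often screened using UGESP's four-fifths rule of thumb: a group whose selection rate is less than 80% of the rate for the most-selected group is generally regarded as showing evidence of adverse impact. The minimal Python sketch below, using hypothetical applicant counts, shows the basic calculation.

    # Illustrative four-fifths (80%) rule check; applicant and selection counts are hypothetical.
    def selection_rate(selected, applicants):
        """Proportion of applicants in a group who were selected."""
        return selected / applicants

    groups = {
        "group_a": {"applicants": 200, "selected": 90},   # selection rate 0.45
        "group_b": {"applicants": 180, "selected": 54},   # selection rate 0.30
    }

    rates = {name: selection_rate(g["selected"], g["applicants"]) for name, g in groups.items()}
    highest_rate = max(rates.values())

    for name, rate in rates.items():
        impact_ratio = rate / highest_rate
        status = "potential adverse impact" if impact_ratio < 0.8 else "within the four-fifths threshold"
        print(f"{name}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} ({status})")

A ratio below 0.80 is only a screening heuristic; in practice, practitioners typically supplement it with statistical significance testing, and UGESP also calls for job-relatedness (validation) evidence and consideration of alternative selection procedures, as noted above.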

Question 8 states that OFCCP does not endorse or certify AI software as compliant with the laws it enforces, whether the software is developed by the contractor or acquired from a vendor. This should not be surprising, as context and implementation details are critical considerations in determining adverse impact, as well as in establishing the validity of a selection procedure (i.e., the same selection procedure implemented differently may yield different conclusions).

Question 9 discusses compliance responsibility. OFCCP notes that federal contractors cannot delegate compliance obligations: contractors are responsible for ensuring adherence to nondiscrimination and affirmative action laws, even when using third-party products or services. During compliance reviews or investigations, contractors must provide requested information to demonstrate compliance.

Question 10 points readers to an extensive list of resources to learn more, including additional guidance from OFCCP on compliance issues that may be relevant to the use of AI systems, as well as resources from the Department of Labor, the White House, and other federal agencies, including the National Institute of Standards and Technology.

Promising Practices

The second section of the guidance outlines promising practices for federal contractors regarding the development and use of AI in the EEO context, organized into four categories. While federal contractors are not explicitly required to adopt these practices, they are useful steps contractors may want to take to promote trust and safeguard against potential harms to applicants. Each category is listed and briefly summarized below.

  • Providing notice that AI systems are being used: Federal contractors should provide advance notice and appropriate disclosure and transparency to applicants and employees about AI usage, data collection and usage, privacy safeguards, and processes for requesting a reasonable accommodation.
  • Developing and Implementing AI Systems: When employing an AI system, federal contractors should prioritize process standardization in its implementation, ensure human oversight throughout the process, monitor and analyze adverse impact, and maintain documentation regarding its development and deployment. Further, contractors are encouraged to engage with employees in the design and deployment of AI systems. Finally, contractors are advised to consider establishing internal governance related to the use of AI-based technologies.
  • Using Vendor-Created AI Systems: When acquiring an AI system from a vendor, contractors should verify and/or understand critical components of the AI selection procedure, such as: data source and quality (in both operational use and model training/building), data privacy practices, details of how the selection procedure operates (e.g., how scores or classifications are created, basis for decisions, predictive nature of the procedure and alignment to business need), and the potential impact of differences between the AI selection procedure training and development relative to the contractor’s implementation use case. Further, contractors are encouraged to confirm reliability, security, fairness, and transparency/explainability of the AI selection procedure.
  • Ensuring Accessibility and Disability Inclusion: Contractors must provide reasonable accommodations for individuals with disabilities and ensure that AI systems are accessible and inclusive in design and that they measure the essential functions of the job. Contractors should regularly test and monitor accessibility for various disabilities, regardless of whether they use AI systems and/or a vendor.

Summary

The FAQ section of this guidance highlights that federal contractors using AI in employment decisions should anticipate OFCCP compliance evaluations and that these evaluations will continue to follow the long-established standards outlined in UGESP. Thus, whether contractors use AI systems or not, it will remain essential to prioritize monitoring for adverse impact and conducting validation research to ensure compliance.

While not mandatory, the promising practices outline expectations for contractors to mitigate potential bias and ensure trustworthy AI development and use. They emphasize the importance of understanding the design, development, intended use, and impacts of AI systems in employment decision-making, whether those systems are internally developed or procured from vendors. Many of these promising practices align well with professional guidance, prior federal guidance, and trends stemming from enacted and proposed state and local laws. That said, some may be challenging to implement given the evolving nature of AI selection procedures.