Since late October 2021, when the Equal Employment Opportunity Commission (EEOC) launched its Initiative on Artificial Intelligence (AI) and Algorithmic Fairness, the agency has taken several steps to ensure that AI and other emerging tools used in hiring and other employment decisions comply with the federal civil rights laws it enforces, including Title VII of the Civil Rights Act of 1964 (Title VII), the Age Discrimination in Employment Act (ADEA), and the Americans with Disabilities Act (ADA). Among other things, the EEOC has hosted disability-focused listening and educational sessions, published technical assistance regarding the ADA and the use of AI and other technologies, and held a public hearing to examine the use of automated systems in employment decisions.
Consistent with its initiative and its Draft Strategic Enforcement Plan for 2023-2027, on May 18, 2023, the EEOC issued new “technical assistance” entitled Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964. The technical assistance comes on the heels of a joint federal agency declaration on AI and is another step toward building a framework for regulating employers’ use of AI under current law.
The technical assistance, issued in the form of FAQs, answers questions that software developers and employers may have regarding the use of “algorithmic decision-making tools” (ADTs) in selection decisions such as hiring, promotion, and termination, and explains how to avoid an adverse or disparate impact on classes protected by Title VII, the ADEA, and the ADA.
Relying on definitions used by federal agencies and in federal legislation, the technical assistance defines terms associated with automated systems and AI used in the guidance, including “software,” “algorithm,” and “artificial intelligence.” It also identifies resume-screening software, virtual assistants and chatbots that interview and evaluate candidates for employment, and testing software deployed for personality, aptitude, or cognitive-skill testing as examples of ADTs.
The EEOC has limited the technical assistance’s scope to “selection procedures.” The guidance does not, at least yet, address other aspects of a disparate impact analysis – e.g., whether a selection tool constitutes a valid measure of a job-related trait – nor does it cover the application of ADTs to other employment practices of a covered employer. The FAQs also do not cover the potential for disparate treatment, or intentional discrimination, arising from the use of ADTs.
To assist employers in determining whether tests and selection procedures using ADTs have an adverse impact on a protected class or category, the technical assistance relies on the Uniform Guidelines on Employee Selection Procedures (the “Guidelines”), a framework issued more than four decades ago for determining adverse impact. The technical assistance confirms that the Guidelines apply to ADT selection tools.
Applying the Guidelines to ADTs, if the use of a selection tool produces a selection rate for individuals within a protected group or category that is substantially lower than the rate for the most-selected group – less than four-fifths, or 80%, of that rate, i.e., the “Four-Fifths Rule of Thumb” – a preliminary finding of adverse impact is likely, and the employer must examine the ADT to determine whether it in fact has an adverse impact. If it does, the employer must show either that the use of the ADT is job related and consistent with business necessity, or that the preliminary Four-Fifths Rule assessment was in error. The technical assistance cautions employers, however, that the Guidelines analysis might not be appropriate in all circumstances – where, for example, the sample size is too small to be statistically meaningful, or where a court might employ a different test to determine disparate impact, which could result in a finding of unlawful discrimination.
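To make the arithmetic concrete, below is a minimal sketch of the Four-Fifths Rule calculation. The applicant numbers and function names are hypothetical illustrations, not part of the EEOC guidance, and the rule is only a preliminary screen, not a legal conclusion:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Selection rate: the share of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_flag(group_rate: float, highest_rate: float) -> bool:
    """Preliminary adverse-impact flag under the Four-Fifths Rule of Thumb:
    True if a group's selection rate is less than 80% of the rate for the
    most-selected group."""
    return group_rate / highest_rate < 0.8

# Hypothetical applicant pool: Group A, 48 of 80 selected (60%);
# Group B, 12 of 40 selected (30%).
rate_a = selection_rate(48, 80)  # 0.60
rate_b = selection_rate(12, 40)  # 0.30

# 0.30 / 0.60 = 0.50, which is below the 0.80 threshold, so the rule
# flags a possible adverse impact warranting closer examination.
print(four_fifths_flag(rate_b, rate_a))  # True
```

As the technical assistance itself cautions, a flag like this is only a starting point; a small sample or a court’s use of a different disparate impact test may change the analysis.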
Where an employer relies on a vendor to develop or administer ADTs, the technical assistance states that the employer is responsible for determining whether, and what kind of, adverse impact analysis the vendor has conducted and whether that analysis, if performed, suggests that the tool results in a disparate selection impact on a protected class or category. If so, the technical assistance suggests, but does not specifically mandate, that the employer explore alternative methods of selection or adjustments to the ADT itself.
While the EEOC states that the technical assistance is intended only to offer guidance and “provide specific clarity” on the deployment of ADTs, Epstein Becker & Green advises employers to undertake continued self-critical analysis, including bias auditing (giving due consideration to the possibility of privileged analyses), of any ADT used in the selection process for hiring, promotion, and termination, as well as in any other aspect of the employment lifecycle, including compensation, benefits, transfer, and demotion. To the extent that an employee or a governmental agency challenges an employer’s use of ADTs, whether under the federal laws that the EEOC enforces or the emerging local laws governing workplace AI, such affirmative and prophylactic actions may be a critical factor in demonstrating compliance and avoiding liability.