Since late October 2021, when the Equal Employment Opportunity Commission (EEOC) launched its Initiative on Artificial Intelligence (AI) and Algorithmic Fairness, the agency has taken several steps to ensure AI and other emerging tools used in hiring and other employment decisions comply with federal civil rights laws that the agency enforces, including Title VII of the Civil Rights Act of 1964 (Title VII), the Age Discrimination in Employment Act (ADEA), and the Americans with Disabilities Act (ADA). Among other things, the EEOC has hosted disability-focused listening and educational sessions, published technical assistance regarding the ADA and the use of AI and other technologies, and held a public hearing to examine the use of automated systems in employment decisions.

Consistent with its initiative and its Draft Strategic Enforcement Plan for 2023-2027, on May 18, 2023, the EEOC issued new “technical assistance” entitled Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964. The technical assistance comes on the heels of a joint federal agency declaration on AI and is another step toward the building of a framework for regulating employers’ use of AI based on current law.

The technical assistance, issued in the form of FAQs, answers questions that software developers and employers may have regarding the use of “algorithmic decision-making tools” (ADTs) in selection decisions such as hiring, promotion, and termination, and how to avoid an adverse or disparate impact on classes protected by Title VII, the ADEA, and the ADA.

Relying on definitions used by federal agencies and in federal legislation, the technical assistance defines terms associated with automated systems and AI used in the guidance, including “software,” “algorithm,” and “artificial intelligence.” It also identifies resume screening software, virtual assistants, and chatbots that interview and evaluate candidates for employment, as well as testing software deployed for personality, aptitude, or cognitive skill testing, as examples of ADTs.

The EEOC has limited the technical assistance’s scope to “selection procedures,” and does not, at least yet, address other aspects of a disparate impact analysis – e.g., whether a selection tool constitutes a valid measure of a job-related trait – nor does it cover the application of ADTs to other employment practices of a covered employer. The FAQs also do not cover the potential for disparate treatment or intentional discrimination arising from the use of ADTs.

To assist employers in determining whether tests and selection procedures using ADTs have an adverse impact on a protected class or category, the technical assistance relies on the Uniform Guidelines on Employee Selection Procedures (the “Guidelines”), a set of guidelines issued over four decades ago to determine adverse impact. The technical assistance confirms that the Guidelines apply to ADT selection tools.

Applying the Guidelines to ADTs, if the use of a selection tool causes a selection rate for individuals within a protected group or category that is substantially lower (less than 4/5 or 80% – i.e., the “Four-Fifths Rule of Thumb”) than that of the most-selected group, a preliminary finding of adverse impact is likely, and the employer must examine the ADT to determine if it in fact has an adverse impact. If it does, the employer must show either that the use of the ADT is job related and consistent with business necessity, or that the preliminary Four-Fifths Rule assessment was in error. The technical assistance cautions employers, however, that the Guidelines analysis might not be appropriate in all circumstances – where, for example, the sample may be too small to be statistically significant, or where a court might employ a different test to determine disparate impact, which could result in a finding of unlawful discrimination.
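The Four-Fifths Rule comparison described above reduces to simple arithmetic: compute each group’s selection rate, then compare it to the highest-selected group’s rate. The sketch below illustrates that calculation with hypothetical group names and numbers (all figures are assumptions for illustration, not data from the EEOC guidance):

```python
# Illustrative sketch of the Four-Fifths Rule of Thumb comparison.
# Group names and counts below are hypothetical.

def four_fifths_check(selected, applicants):
    """For each group, compare its selection rate to the highest group's rate.

    selected / applicants: dicts mapping group name -> counts.
    Returns dict of group -> (rate, ratio_to_highest, preliminarily_flagged).
    """
    rates = {g: selected[g] / applicants[g] for g in applicants}
    highest = max(rates.values())
    results = {}
    for group, rate in rates.items():
        ratio = rate / highest
        # A ratio below 0.8 suggests a preliminary finding of adverse impact.
        results[group] = (rate, ratio, ratio < 0.8)
    return results

selected = {"Group A": 60, "Group B": 30}
applicants = {"Group A": 100, "Group B": 75}
for group, (rate, ratio, flagged) in four_fifths_check(selected, applicants).items():
    print(f"{group}: rate={rate:.0%}, ratio={ratio:.2f}, flagged={flagged}")
```

In this hypothetical, Group B’s 40% selection rate is only two-thirds of Group A’s 60% rate, below the 80% threshold, so the tool would warrant the closer examination the guidance describes. As the technical assistance cautions, this arithmetic is only a rule of thumb and may not be reliable for small samples.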

Where an employer relies on a vendor to develop or administer ADTs, the technical assistance states that the employer is responsible for determining if and what kind of adverse impact analysis the vendor has conducted and whether the analysis, if performed, suggests that the tool results in a disparate selection impact on a protected class or category. If so, the technical guidance suggests, but does not specifically mandate, that the employer should explore alternative methods of selection or adjustment to the ADT itself.

While the EEOC states that the technical assistance is intended only to offer guidance and “provide specific clarity” on the deployment of ADTs, Epstein Becker & Green advises employers to undertake continued self-critical analysis, inclusive of bias auditing (giving due consideration to the possibility of privileged analyses), of any ADT used in the selection process for hiring, promotion, and termination, as well as any other aspect of the employment lifecycle, including compensation, benefits, transfer, and demotion. To the extent that an employee or governmental agency challenges an employer’s use of ADTs, whether under the federal laws that the EEOC enforces or the emerging local laws governing workplace AI, such affirmative and prophylactic actions may be a critical factor in demonstrating compliance and avoiding liability.
