On August 9, 2023, the U.S. Equal Employment Opportunity Commission ("EEOC") and iTutorGroup, Inc. and related companies (collectively, "iTutorGroup") filed a joint notice of settlement and a request for approval and execution of a consent decree, effectively settling claims that the EEOC brought last year against iTutorGroup regarding its application software. The EEOC claimed in its lawsuit that iTutorGroup violated the Age Discrimination in Employment Act ("ADEA") by programming its application software to automatically reject female applicants age 55 or older and male applicants age 60 or older, resulting in the rejection of hundreds of applicants.
The proposed iTutorGroup settlement has been interpreted by many commentators as a sign that the EEOC is sharpening its strategic focus on the use of automated systems, including artificial intelligence ("AI") and machine learning ("ML"), in the recruitment and hiring process. It also appears to be the first settlement of a purported AI discrimination lawsuit brought by the EEOC. While the EEOC's Artificial Intelligence and Algorithmic Fairness Initiative focuses on the discriminatory impact of AI and ML used in hiring and employment decisions, the iTutorGroup case involved allegations that iTutorGroup intentionally programmed its tutor application software to categorically exclude, and therefore intentionally discriminate against, older applicants under a disparate treatment theory. In other words, human beings intended the discriminatory inputs and results. The settlement requires the company to implement extensive monitoring, reporting, record-keeping, and compliance procedures before it may resume soliciting, receiving, or considering tutor applicants located in the United States.
Here are four key points that employers should note:
- Disparate Impact Remains a Concern. Many forms of workplace AI tools exist, the most common being those used for candidate selection. Vendors offering AI-powered tools often program these automated systems to be facially neutral with respect to age, race, and gender. Even where employers do not expressly exclude candidates based on a protected category, however, disparate impact remains a concern. Employers using such tools should consider commissioning a bias audit to ensure compliance with federal and state employment laws. Engaging counsel to conduct the audit increases the likelihood that the resulting analysis will be protected from disclosure by the attorney-client privilege. Furthermore, employers using "Automated Employment Decision Tools" in New York City are required to conduct a bias audit and publish a summary of its results.
- EEOC's Focus on Artificial Intelligence Is Not Limited to Disparate Impact Claims. The EEOC has repeatedly warned employers that AI or ML tools must not be designed to incorporate discriminatory factors, and that federal authorities will apply existing legal frameworks, such as disparate treatment theory, in their enforcement efforts. Employers should consider engaging counsel to audit their employment-related documents, practices, policies, and procedures. Doing so may help companies catch issues early, rather than at a later stage, such as during a potential acquisition or future funding round.
- Race and Gender Analyses Are Important, but Employers Should Keep All Protected Characteristics in Mind. When analyzing AI and ML tools, many regulatory authorities and commentators focus on the impact on race and gender. Employers, however, should keep in mind all characteristics protected by applicable federal, state, and local laws. For example, in May 2022, the EEOC released guidance explaining how an employer's use of software, algorithms, or AI could run afoul of the Americans with Disabilities Act if it disadvantages job applicants and employees with disabilities.
- Foreign Companies Operating in the United States May Be Subject to U.S. Employment Laws. The iTutorGroup case is a reminder that foreign companies doing business in the United States that recruit or employ workers may be subject to federal employment laws and other discrimination laws. According to the EEOC's complaint, the tutors recruited by iTutorGroup were paid on an hourly basis under six-month contracts to provide remote English-language tutoring to customers located in China. Companies should review the EEOC's April 2003 guidance on the rights of employees working for multinational employers, and work with employment counsel to ensure that their policies, practices, and procedures comply with applicable employment laws, lest they be haled into a U.S. federal court to litigate employment claims.
* * *
There has been a significant increase in proposed legislation seeking to govern automated employment decision tools. We previously reported on New York City's Local Law 144, which went into effect earlier this year, and published an Insight on the Illinois Biometric Information Privacy Act, which has given rise to significant litigation since it was enacted in 2008. We are also tracking and analyzing the proposed federal "No Robot Bosses Act." Epstein Becker Green has an established history of counseling and defending businesses and providing guidance with respect to the creation and use of AI and ML tools. We encourage employers using or considering automated workplace tools that rely on AI or ML to stay tuned as we cover future developments in this rapidly changing space.