On August 9, 2023, the U.S. Equal Employment Opportunity Commission (“EEOC”) and iTutorGroup, Inc. and related companies (collectively, “iTutorGroup”) filed a joint notice of settlement and a request for approval and execution of a consent decree, effectively settling claims that the EEOC brought last year against iTutorGroup regarding its application software.  The EEOC claimed in its lawsuit that iTutorGroup violated the Age Discrimination in Employment Act (“ADEA”) by programming its application software to automatically reject hundreds of female applicants age 55 or older and male applicants age 60 or older.

The proposed iTutorGroup settlement has been interpreted by many commentators as a sign that the EEOC is sharpening its strategic focus on the use of automated systems, including artificial intelligence (“AI”) or machine learning (“ML”), in the recruitment and hiring process.  It also appears to be the first settlement of a purported AI discrimination lawsuit brought by the EEOC.  While the EEOC’s Artificial Intelligence and Algorithmic Fairness Initiative focuses on the discriminatory impact of AI and ML used in hiring and employment decisions, the iTutorGroup case involved allegations that iTutorGroup intentionally programmed its tutor application software to categorically exclude, and therefore intentionally discriminate against, older applicants under a disparate treatment theory.  In other words, human beings intended the discriminatory inputs and results.  The iTutorGroup settlement requires that the company implement extensive monitoring, reporting, record-keeping, and compliance procedures before it may resume soliciting, receiving, or considering tutor applicants located in the United States.
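
To make the distinction concrete, the sketch below shows the kind of express, human-coded exclusion rule described in the EEOC’s complaint.  It is a hypothetical illustration only; the function name, applicant fields, and logic are our assumptions based on the allegations, not iTutorGroup’s actual code.

```python
# Hypothetical illustration of the categorical exclusion alleged in the EEOC's
# complaint -- NOT iTutorGroup's actual code. The function name and applicant
# fields are invented for this sketch.

def auto_reject(applicant: dict) -> bool:
    """Return True if the application is rejected before any human review."""
    age, gender = applicant["age"], applicant["gender"]
    # An express age- and gender-based test like this is discriminatory on its
    # face: no statistical analysis is needed to infer intent, which is why
    # the EEOC proceeded on a disparate treatment theory.
    if gender == "female" and age >= 55:
        return True
    if gender == "male" and age >= 60:
        return True
    return False
```

Contrast this with a facially neutral screening model, where discrimination, if any, surfaces only statistically as a disparate impact on a protected group, as discussed in point 1 below.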

Here are four key points that employers should note:

  1. Disparate Impact Remains a Concern.  Workplace AI tools take many forms, the most common being those used for candidate selection.  Vendors offering AI-powered tools often program these automated systems to be facially neutral with respect to age, race, and gender.  Even where employers do not expressly exclude candidates based on a protected category, however, disparate impact remains a concern.  Employers using such tools should consider commissioning a bias audit to ensure compliance with federal and state employment laws (a minimal sketch of one such analysis appears after this list).  Engaging counsel to conduct the audit increases the likelihood that the resulting analysis will be protected from disclosure by the attorney-client privilege.  Furthermore, employers using “Automated Employment Decision Tools” in New York City are required to conduct a bias audit and publish a summary of its results.
  2. EEOC’s Focus on Artificial Intelligence Is Not Limited to Disparate Impact Claims.  The EEOC has repeatedly warned employers using AI or ML tools that such tools must not incorporate discriminatory factors into their design, and that federal authorities will apply existing legal frameworks, such as disparate treatment theory, in their enforcement efforts.  Employers should consider engaging counsel to audit their employment-related documents, practices, policies, and procedures.  Doing so may help companies catch issues early, rather than at a later stage such as a potential acquisition or future funding round.
  3. Race and Gender Analyses Are Important, but Employers Should Keep All Protected Characteristics in Mind.  When analyzing AI and ML tools, many regulatory authorities and commentators focus on their impact on race and gender.  Employers, however, should keep in mind all characteristics protected by applicable federal, state, and local laws.  For example, in May 2022, the EEOC released guidance explaining how an employer’s use of software, algorithms, and/or AI could run afoul of the Americans with Disabilities Act if it disadvantages job applicants and employees with disabilities.
  4. Foreign Companies Operating in the United States May Be Subject to U.S. Employment Laws.  The iTutorGroup case is a reminder that foreign companies doing business in the United States that recruit or employ workers may be subject to federal employment discrimination and other laws.  According to the EEOC’s complaint, the tutors recruited by iTutorGroup were paid on an hourly basis under six-month contracts to provide remote English-language tutoring to customers located in China.  Companies should review the EEOC’s April 2003 guidance on the rights of employees working for multinational employers, and work with employment counsel to ensure that their policies, practices, and procedures comply with applicable employment laws, to avoid being haled into a U.S. federal court to litigate employment claims.
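
To illustrate the kind of analysis a bias audit might include (as referenced in point 1 above), the sketch below computes selection rates by group and applies the four-fifths rule from the EEOC’s Uniform Guidelines on Employee Selection Procedures, under which a group’s selection rate below 80% of the highest group’s rate is treated as preliminary evidence of adverse impact.  The data, column names, and grouping below are illustrative assumptions; an actual audit (such as one required under New York City’s Local Law 144) involves additional metrics and legal judgment.

```python
# Minimal adverse-impact check using the four-fifths (80%) rule.
# The applicant data and column names below are hypothetical.
import pandas as pd

applicants = pd.DataFrame({
    # 1 = advanced by the screening tool, 0 = auto-rejected
    "selected": [1, 1, 0, 1, 1, 0, 1, 0, 1, 0],
    "age_band": ["under_40"] * 5 + ["40_plus"] * 5,
})

# Selection rate for each group.
rates = applicants.groupby("age_band")["selected"].mean()

# Impact ratio: each group's rate relative to the most-selected group.
impact_ratios = rates / rates.max()

# Ratios below 0.8 warrant closer scrutiny under the four-fifths rule.
for group, ratio in impact_ratios.items():
    flag = "FLAG" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rates[group]:.0%}, impact ratio {ratio:.2f} [{flag}]")
```

Here the “40_plus” group’s selection rate (40%) is half the “under_40” rate (80%), yielding an impact ratio of 0.50, well below the four-fifths threshold.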

* * *

There has been a significant increase in proposed legislation seeking to govern automated employment decision tools.  We previously reported on New York City’s Local Law 144, which went into effect earlier this year, and published an Insight on the Illinois Biometric Information Privacy Act, which has given rise to significant litigation since it was enacted in 2008.  We are also tracking and analyzing the proposed federal “No Robot Bosses Act.”  Epstein Becker Green has an established history of counseling and defending businesses and providing guidance with respect to the creation and use of AI and ML tools.  We encourage employers utilizing, or considering the use of, automated workplace tools that rely on AI or ML to stay tuned as we cover future developments in this rapidly changing space.

