On July 20, 2023, U.S. Senators Bob Casey (D-PA) and Brian Schatz (D-HI) introduced the “No Robot Bosses Act.” Beyond bringing to mind a catchy title for a dystopian science fiction novel, the bill aims to regulate the use of “automated decision systems” throughout the employment life cycle and, as such, appears broader in scope than New York City’s Local Law 144 of 2021, about which we have previously written and which New York City recently began enforcing. Although the text of the proposed federal legislation has not yet been widely circulated, a two-page fact sheet released by the sponsoring Senators outlines the bill’s pertinent provisions regarding an employer’s use of automated decision systems affecting employees. According to the fact sheet, the bill would:

  • prohibit employers’ exclusive reliance on automated decision systems;
  • require pre-deployment and periodic testing and validation of automated decision systems to prevent unlawful biases;
  • require operational training;
  • mandate independent, human oversight before using outputs;
  • require timely disclosures of use, data inputs and outputs, and employee rights with respect to the decisions; and
  • establish a regulatory agency at the U.S. Department of Labor (“DOL”) called the “Technology and Worker Protection Division.”

The bill does not define with specificity “automated systems,” nor does it define or limit the term “employment decision.” The fact sheet, however, sets forth examples of automated systems potentially subject to the “No Robot Bosses Act,” including “recruitment software, powered by machine learning algorithms,” “automated scheduling software,” and “tracking algorithms” applicable to delivery drivers. These examples suggest a broad intended application that could include other types of monitoring technology. But the fact sheet does not provide examples of the nature or scope of an “employment decision,” nor does it identify the industries or classes of employees subject to the law. Moreover, at this time, the bill is silent as to enforcement mechanisms, penalties, or fines for violations.

In addition, Senators Casey and Schatz, joined by Senator Cory Booker (D-NJ), have introduced the “Exploitative Workplace Surveillance and Technologies Task Force Act of 2023.” As with the “No Robot Bosses Act,” the text of this bill is not yet available, but the Senators released a one-page fact sheet detailing that the proposed federal legislation would create a “dynamic interagency task force,” led by the Department of Labor and the Office of Science and Technology Policy, to study a range of issues related to automated systems and workplace monitoring technology.

While these proposed bills are still in their early stages, lawmakers at the state, local, and federal levels continue to consider methods of regulating employment-related automated systems and artificial intelligence more broadly. At the same time, federal regulators and private plaintiffs are leveraging existing employment laws, including Title VII of the Civil Rights Act, in connection with employers’ use of technology that automates employment decisions. For example, the EEOC recently published a technical assistance memorandum alerting employers to, and helping them mitigate, the risks of using “automated decision tools” in the workplace. Consequently, it is critical that employers, and especially personnel involved in recruiting, hiring, and promotion, identify and assess potential risks in the use of AI tools in employment decision-making by:

  1. Understanding and documenting the systems and vendors used in making employment decisions throughout the employment life cycle;
  2. Assessing the need for an artificial intelligence governance framework or other internal policies and procedures that take into account considerations related to safety, algorithmic discrimination, data privacy, transparency, and human oversight and fallback;
  3. In conjunction with counsel, conducting impact assessments as to the use of automated systems; and
  4. Ensuring compliance with all applicable laws governing automated decision systems and artificial intelligence.

Please contact Epstein Becker Green’s multidisciplinary artificial intelligence team for questions about automated decision systems.
