Categories: Technology

We recently blogged about gender discrimination lawsuits filed against technology industry employers. In the wake of these lawsuits have come news stories about the lack of diversity in the technology industry. The scale of the statistical disparity (for example, 90% of Twitter’s technical employees are male) creates major litigation risk for companies seeking to remedy it. Technology companies eager to accept social responsibility for correcting these discrepancies must be careful not to inadvertently invite legal liability for them as well.

Although there seems to be a consensus that the technology industry’s lack of diversity is a problem that should be addressed, there is a great deal of disagreement over how to address it. Some groups, such as Jesse Jackson Sr.’s Rainbow PUSH Coalition, have focused on publicizing employee population statistics in an effort to bring the issue out into the open. Employers, however, are still experimenting with possible strategies to address the problem.

Class action lawsuits designed to punish employers with large monetary penalties without providing solutions to the disparities threaten this process of experimentation. In a recently filed class action lawsuit against Twitter,[1] a former employee is seeking to represent a class of all current and former female employees who were allegedly denied promotions as a result of gender discrimination. Rather than pointing to a particular discriminatory act or individual, the Complaint cites the employer’s use of “subjective requirements” and “unconscious prejudices and gender-based stereotypes” as the reason why few female employees receive promotions.

A distressing feature of the Complaint is how it turns the employer’s own attempts to address diversity issues against it. It cites the employer’s “internal diversity studies, focusing on barriers to women’s advancement… bias mitigation training” and its treatment of gender disparity “as a company-wide issue by partnering with many organizations to continue improving its diversity standing and move the needle” as evidence that the discrimination claims should be adjudicated on a class-wide basis. The Complaint also uses employee population statistics published at the urging of civil rights groups as evidence of discriminatory bias.

The Complaint appears to be laying the groundwork for a legal theory that has thus far not been successful. The theory, as discussed in the Complaint and in early cases, looks to social science research on the phenomenon of “implicit bias,” meaning “a person’s automatic preference for one [classification] over another.”[2] In early cases, courts have been presented with experimental evidence claiming to support the conclusion that as many as 70% of Americans display an implicit bias in favor of whites over blacks.[3] “Implicit” bias claims give plaintiffs’ lawyers a mechanism for attributing statistical disparities among protected classes in an employee population to discrimination. Combined with evidence of wide gender and racial disparities, this has the potential to generate massive damages in class action lawsuits.

One of the best examples of how seeking to improve diversity can have unintended legal consequences is the Implicit Association Test (“IAT”). The IAT is designed to measure implicit bias by measuring the time it takes to associate good and bad traits with a particular classification. The most common iteration of the test shows that a substantial portion of test-takers implicitly associate faces of Caucasians with good things, and faces of African Americans with bad things. Many implicit bias training procedures involve having participants take the IAT.[4]

Thus far, plaintiffs have had difficulty getting IAT results admitted as evidence in litigation. Generally, plaintiffs have tried to argue: (1) the IAT demonstrates that most people exhibit implicit bias; (2) decision makers are statistically likely to exhibit implicit bias, so decision makers at the defendant exhibit implicit bias; (3) people who exhibit implicit bias will unconsciously discriminate based upon that bias; ergo (4) statistical disparities based on race or gender within the defendant’s employee population are the result of implicit bias. Although many parts of this argument are susceptible to attack, thus far the most successful defense has been to attack point (2) by arguing that even if implicit bias is rampant in the general population, there is no evidence that a particular decision maker exhibits implicit bias.[5] Plaintiffs can negate this argument by entering individual decision makers’ IAT results into evidence.

The best way to stop a decision maker’s IAT results from being admitted as evidence is for that decision maker not to take the IAT. Documents memorializing IAT results may be produced during discovery, and even if results are not memorialized, they could still be discovered through deposition testimony. Fortunately, courts have declined to order employees to take the IAT as part of discovery.[6]

The IAT is a popular tool in implicit bias training because it is free and quick to administer, and the results usually indicate that many of the test-takers exhibit implicit biases against protected classes. However, it provides plaintiffs with evidence of discrimination that would otherwise be unavailable.[7]

Employers should work to create a more diverse workforce, but they should take care to do it appropriately. The stakes are high: plaintiffs’ lawyers are seeking to pin the monetary cost of centuries of discrimination on individual employers and have shown a willingness to use an employer’s attempts to solve the problem as evidence against it. Human resources initiatives aimed at increasing diversity or alleviating bias should be thoroughly audited to ensure they will not appear as Plaintiff’s Exhibit A in the near future. With monetary responsibility for the end result of hundreds of years of discrimination on the line, the risks are too great to take unnecessarily.



[1] Huang v. Twitter, Inc., CGC-15-544813

[2] Pippen v. Iowa, 854 N.W.2d 1 (Iowa 2014).

[3] Id.

[4] In Jaffe v. Morgan Stanley & Co., 2008 U.S. Dist. LEXIS 12208 (N.D. Cal. Feb. 7, 2008), the court approved a settlement agreement where the defendant agreed “to provide diversity related training to field sales branch management which incorporates elements of the Implicit Association Test or similar tool agreed upon by the parties.”

[5] See e.g., Pippen v. Iowa, Iowa Dist. Ct., No. 107038 (2012); Jones v. Nat'l Council of Young Men's Christian Associations of the United States, 2014 U.S. Dist. LEXIS 43866 (N.D. Ill. Mar. 31, 2014).

[6] Palgut v. City of Colo. Springs, 2008 U.S. Dist. LEXIS 123115 (D. Colo. July 3, 2008) (denying plaintiff’s Rule 35 motion to compel defendant’s employees to complete the IAT).

[7] Readers whose decisions may be scrutinized for implicit bias may not want to take the IAT, which is administered through Harvard University and takes fewer than 10 minutes to complete.
