How do you hire only the “best” employees? How does any employer find the time to “vet” the hundreds or even thousands of job seekers applying for open positions? A number of employers have turned to “big data”: algorithms designed to quickly analyze the applicants competing for coveted positions. These algorithms rely on sets of predictive variables that allow HR departments and headhunters alike to filter the avalanche of applications more efficiently and find the best talent for the many jobs available in today’s economy. In practice, the programs screen resumes submitted through online application systems, including or excluding candidates based on certain keywords. However, as with most innovations in technology, this one carries its share of potential risks for employers.
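
As a rough illustration of the kind of keyword screening described above, the sketch below filters resumes against required and disqualifying terms. The resume fields and keywords are hypothetical, not drawn from any particular applicant-tracking product.

```python
# Illustrative sketch of keyword-based resume screening of the kind described
# above. The resume fields, required terms, and excluded terms are hypothetical.
resumes = [
    {"name": "Applicant 1", "text": "Ten years of payroll and HRIS administration."},
    {"name": "Applicant 2", "text": "Recent graduate seeking an entry-level role."},
]

REQUIRED_KEYWORDS = {"payroll", "hris"}   # resume must mention all of these
EXCLUDED_KEYWORDS = {"intern"}            # resume is dropped if it mentions any of these

def passes_screen(resume):
    """Return True if the resume clears the keyword filter."""
    words = set(resume["text"].lower().replace(".", " ").split())
    return REQUIRED_KEYWORDS <= words and not (EXCLUDED_KEYWORDS & words)

shortlist = [r["name"] for r in resumes if passes_screen(r)]
print(shortlist)  # ['Applicant 1']
```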

Employers’ electronic vetting may present a problem for applicants whose applications tend to be excluded based upon the programming of the algorithm. At this time, EEOC guidance on this specific issue is sparse, as the agency itself is still learning about these technologies. Indeed, in October the EEOC held a public meeting in Washington, D.C., on the use of “big data” in employment decisions, including its proper use in the context of criminal background checks. While noting that “big data” has the potential to greatly reduce unlawful bias, the EEOC relayed in a recent press release a warning from “big data” researchers: “[A]lgorithms may be trained to predict outcomes which are themselves the result of previous discrimination.” In other words, because these algorithms are written by human programmers and trained to predict past outcomes, they may come to rely on criteria that are themselves discriminatory.
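
A stylized example may make the researchers’ warning concrete. In the sketch below, which uses hypothetical groups and counts, a naive model that scores new applicants by their group’s historical hire rate simply reproduces whatever disparity those historical decisions contain.

```python
# Toy illustration of the researchers' warning: a model "trained" on past
# hiring outcomes learns whatever disparities those outcomes contain.
# The groups, counts, and scoring rule are hypothetical.
history = {
    # group: (applicants considered, applicants hired) under past practices
    "group_a": (100, 40),
    "group_b": (100, 10),
}

# A naive "predictive" score: the historical hire rate for the applicant's group.
learned_score = {group: hired / seen for group, (seen, hired) in history.items()}

# New, equally qualified applicants are ranked by the learned score, so the
# past disparity carries straight through to the new recommendations.
for group, score in sorted(learned_score.items(), key=lambda kv: -kv[1]):
    print(f"{group}: predicted 'hireability' {score:.2f}")
```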

As the “big data” researchers recognize, employers who use “big data” in employment decisions, including where they contract with third parties to provide such services, are at risk for discrimination claims across a myriad of protected classes. Through the use of certain algorithms, employers may unwittingly discriminate against job seekers on the basis of age, race, gender, or even disability, exposing themselves to disparate impact claims. Employers should take steps to mitigate this risk by closely monitoring their use of big data systems, including retaining and analyzing the data on excluded applications and the algorithms used to exclude them.
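
For employers that do retain this data, a simple statistical check can flag potential disparate impact before it ripens into a claim. The sketch below, which assumes hypothetical group labels and the commonly cited four-fifths guideline, compares selection rates across groups of screened applicants; it is illustrative only and is no substitute for legal review.

```python
# Minimal sketch of an adverse-impact ("four-fifths rule") check that an
# employer or vendor might run on retained screening data. The group labels
# and the 0.8 threshold are illustrative assumptions, not a legal standard
# applied by any particular tool.
from collections import defaultdict

# Each record: (protected_group_label, passed_screen) -- hypothetical data.
applicants = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(records):
    """Return the share of applicants in each group who passed the screen."""
    passed, total = defaultdict(int), defaultdict(int)
    for group, selected in records:
        total[group] += 1
        passed[group] += int(selected)
    return {g: passed[g] / total[g] for g in total}

def adverse_impact_ratios(rates):
    """Compare each group's selection rate to the highest-rate group."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

rates = selection_rates(applicants)
for group, ratio in adverse_impact_ratios(rates).items():
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rates[group]:.2f}, impact ratio {ratio:.2f} ({flag})")
```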