In Friday's post, I shared technical guidance from the U.S. Equal Employment Opportunity Commission and the U.S. Department of Justice to help employers navigate the Americans with Disabilities Act when using software, algorithms, and artificial intelligence to assess job applicants and employees.
But employers using hiring software can discriminate in other ways.
Earlier this month, the EEOC announced that it had sued three integrated companies that had allegedly programmed their online recruitment software to automatically reject older applicants because of their age. If true, this would violate the Age Discrimination in Employment Act. Here's more from the EEOC's press release:
[The companies] hire thousands of tutors based in the United States each year to provide online tutoring from their homes or other remote locations. According to the EEOC’s lawsuit, in 2020, [the companies] programmed their tutor application software to automatically reject female applicants age 55 or older and male applicants age 60 or older. [The companies] rejected more than 200 qualified applicants based in the United States because of their age.
The EEOC has touted this case as an example of why it recently launched its Artificial Intelligence and Algorithmic Fairness Initiative.
Intentional discrimination against older workers is known as "disparate treatment." But the ADEA also prohibits practices that, although facially neutral as to age, have the effect of harming older workers more than younger workers (known as "disparate impact"), unless the employer can show that the practice was based on reasonable factors other than age.
So, regardless of intent, the new EEOC lawsuit is a good reminder for employers to involve employment counsel when expanding or significantly altering hiring practices in ways that could adversely affect a protected class.