
The data used to train the artificial-intelligence algorithm taught it to prefer male candidates.

The news: According to a report by Reuters, Amazon began building an automated system in 2014 to rank job applicants with one to five stars. The company scrapped the project last year after finding that the tool favored male candidates for technical roles.

Why? The AI tool was trained on 10 years’ worth of résumés the company had received. Because tech is a male-dominated industry, the majority of those résumés came from men.

The result: The system effectively taught itself that male candidates were preferable. It reportedly penalized résumés containing the word “women’s” and the names of certain all-women’s colleges. Amazon edited the program to treat those particular terms neutrally, but the company lost confidence that the tool was gender-neutral in other respects.
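This failure mode is easy to reproduce in miniature. The sketch below is a toy illustration, not Amazon’s actual system: it trains an ordinary text classifier on a handful of invented résumé snippets whose hire/reject labels skew against résumés mentioning “women’s,” and the model duly learns a negative weight for that word.

```python
# A minimal sketch of how a classifier trained on historically skewed hiring
# data learns to penalize gendered terms. All résumés and labels are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy training set: past résumés and whether the candidate was hired (1) or not (0).
# Because the historical decisions skew male, "women's" co-occurs with rejections.
resumes = [
    "captain of men's chess club, software engineer",       # hired
    "software engineer, hackathon winner",                  # hired
    "captain of women's chess club, software engineer",     # rejected
    "women's coding society lead, software engineer",       # rejected
    "software engineer, open source contributor",           # hired
    "women's robotics team member, software engineer",      # rejected
]
hired = [1, 1, 0, 0, 1, 0]

# Turn the text into word counts and fit a simple logistic-regression ranker.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative: the model has
# absorbed the bias in its training labels, not any real signal about skill.
idx = vectorizer.vocabulary_["women"]
print(f"weight for 'women': {model.coef_[0][idx]:.2f}")
```

Nothing in the code mentions gender; the bias comes entirely from the labels in the historical data, which is why neutralizing a few known terms is no guarantee the model isn’t discriminating along other lines.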

Why it matters: We can’t treat artificial intelligence as inherently unbiased. Train a system on biased data and the resulting algorithm will be biased too. If unfair AI hiring tools like this one aren’t caught before they are deployed, they will perpetuate long-standing diversity problems in business rather than solve them.
