Amazon's AI Recruitment Tool Scrapped for Gender Bias

The Case
It has been reported that Amazon have had to abandon an artificial intelligence recruitment tool after it was found to be biased against female candidates. The system was trained on CVs submitted by job applicants over a ten-year period. Because a significant majority of those applications came from men, the system learned to favour male candidates over female ones. Despite efforts to rectify the bias, the project was eventually scrapped. For a period, however, Amazon's recruiters did use the tool for recommendations, although they never relied on it alone for final decisions.
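To illustrate the mechanism at a high level, the sketch below is a deliberately simplified, hypothetical reconstruction, not Amazon's actual system. It shows how a text classifier trained on a skewed set of historical hiring decisions, in which most past hires were men, can attach negative weight to terms that appear mainly on women's CVs, even though gender is never an explicit input. All of the CV snippets and labels are invented for demonstration purposes.

```python
# Illustrative sketch only: a toy CV classifier trained on an invented,
# deliberately skewed "historical hiring" dataset. This is not Amazon's
# system; it simply demonstrates how imbalance in training data can
# surface as gender-correlated scoring.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented historical CVs with hiring outcomes (1 = hired, 0 = rejected).
# Most past hires happen to be male-coded, so terms appearing only on
# female-coded CVs end up associated with rejection.
cvs = [
    ("software engineer, captain of men's rugby team", 1),
    ("software engineer, led backend migration", 1),
    ("software engineer, men's chess club, hackathon winner", 1),
    ("software engineer, women's coding society, hackathon winner", 0),
    ("software engineer, captain of women's rugby team", 0),
    ("software engineer, led backend migration, women's network mentor", 0),
]
texts, hired = zip(*cvs)

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(texts)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: the token "women" picks up a negative
# weight purely because of how the historical labels were distributed,
# while a neutral skill term like "hackathon" stays close to zero.
weights = dict(zip(vectoriser.get_feature_names_out(), model.coef_[0]))
for token in ("women", "men", "hackathon"):
    print(f"{token!r}: {weights.get(token, 0.0):+.2f}")
```

The point of the sketch is that no one has to programme a gender preference into such a tool; training on historical outcomes that already reflect an imbalance is enough for the model to reproduce it in its recommendations.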
Key Issues
This example highlights a common theme amongst AI systems: the creation of discriminatory outcomes. Whilst Amazon were able to detect the bias, it is possible to foresee more serious effects; had the tool been actively used to decide the allocation of promotions or pay rises, for instance, it could have paved the way for novel discrimination cases in employment law.