Secret AI Recruiting Tool That Discriminated Against Women Abandoned By Amazon


10/10/2018


Amazon.com's machine-learning-based recruiting engine reportedly discriminated against women, a flaw the company's own artificial intelligence specialists discovered.
 
The machine learning program, under development by Amazon's experts since 2014, was designed to examine job applicants' resumes with the ultimate aim of automating the search for top talent, the news agency Reuters reported, citing sources.
 
Automation is one of the core competencies behind Amazon's dominance of the e-commerce industry, whether in its warehouses or in making pricing decisions. The artificial-intelligence-powered tool was being tested on an experimental basis, with job seekers scored from one to five stars, Reuters reported.
 
“Everyone wanted this holy grail,” one of the people reportedly told Reuters. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”
 
The company realised the flaw in its new system in 2015, when it found that the system was not gender neutral in selecting candidates for software developer jobs and other technical positions.
 
Amazon's experts also identified the cause of the anomaly. The software had been trained on patterns in resumes submitted to the company over the previous 10 years. Most of those resumes came from men, which is a prime reason the machine learning system apparently favoured male resumes over female ones. It is also a reflection of men's dominance of the global tech industry.
 
The effect was that Amazon's artificial-intelligence-powered machines learned that male candidates were preferable. Resumes containing the word "women's" were penalized by the software, and Reuters reported that graduates of two all-women's colleges, which it did not name, were also downgraded.
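The mechanism described above can be illustrated with a minimal sketch. The data and scoring rule below are entirely hypothetical (Amazon's actual model is not public): a system trained on historical hiring outcomes where women's-group keywords appear mostly in rejected resumes will associate those keywords with rejection.

```python
# Hypothetical sketch: how a model trained on historically male-dominated
# hiring outcomes can learn to penalize gendered terms. Data is invented.
from collections import Counter

# Toy historical resumes labeled 1 (hired) or 0 (rejected).
history = [
    ("java developer chess club", 1),
    ("python engineer robotics team", 1),
    ("c++ programmer hackathon winner", 1),
    ("sql analyst debate club", 1),
    ("java developer women's chess club", 0),
    ("python engineer women's coding society", 0),
]

def word_weights(data):
    """Weight each word by (hired count - rejected count): a crude proxy
    for the associations a learned scoring model would pick up."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    return {w: hired[w] - rejected[w] for w in hired | rejected}

weights = word_weights(history)
print(weights["women's"])  # negative: the term appears only in rejected resumes
```

The point of the sketch is that no rule ever mentions gender; the penalty emerges purely from skewed training data, which is why patching individual terms (as Amazon reportedly did) does not guarantee neutrality.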
 
Amazon altered the software to treat these particular terms neutrally. However, sources reportedly said that despite this, the company felt it could not guarantee the machine learning system would not find some other way to sort resumes that would also prove discriminatory.
 
Sources also said that at the start of the current year the team working on the project was disbanded, as the company concluded that continuing it was futile. Recruiters at Amazon did consider the machine's recommendations when selecting candidates, the sources said, but they also weighed other inputs before making a final decision.
 
Amazon did not comment.
 
The incident as reported by Reuters illustrates one of the limitations of artificial intelligence systems. It is also a cautionary tale for large companies such as Hilton Worldwide Holdings Inc and Goldman Sachs Group Inc, which are contemplating replacing some of their hiring processes with automated systems.
 
(Source: www.reuters.com)

