Software may appear to operate without bias because it strictly uses computer code to reach conclusions. That’s why many companies use algorithms to help weed out job applicants when hiring for a new position.
But a team of computer scientists from the University of Utah, the University of Arizona and Haverford College in Pennsylvania has discovered a way to find out whether an algorithm used for hiring decisions, loan approvals and comparably weighty tasks could be biased like a human being.
The researchers, led by Suresh Venkatasubramanian, an associate professor in the University of Utah’s School of Computing, have developed a technique to determine whether such software programs discriminate unintentionally and violate the legal standards for fair access to employment, housing and other opportunities. The team also has devised a method to fix these potentially troubled algorithms. Venkatasubramanian presented the findings Aug. 12 at the Association for Computing Machinery’s 21st SIGKDD Conference on Knowledge Discovery and Data Mining in Sydney, Australia.
“There’s a growing industry around doing résumé filtering and résumé scanning to look for job applicants, so there is definitely interest in this,” says Venkatasubramanian. “If there are structural aspects of the testing process that would discriminate against one community just because of the nature of that community, that is unfair.”
Machine-learning algorithms

Many companies use algorithms in software programs to help filter out job applicants in the hiring process, typically because it can be overwhelming to sort through the applications manually when many people apply for the same job. A program can do the sorting instead by scanning résumés, searching for keywords or numbers (such as school grade point averages) and then assigning an overall score to each applicant.
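As a rough illustration of what such a scoring program might do, here is a minimal sketch in Python. The keywords and weights are invented for illustration only and are not taken from any real hiring system or from the study.

```python
def score_resume(text, gpa, keywords=("python", "sql", "project management")):
    """Toy résumé scorer: blend keyword coverage with GPA on a 4.0 scale.
    Keywords and weights are illustrative assumptions, not a real product's rules."""
    text = text.lower()
    coverage = sum(kw in text for kw in keywords) / len(keywords)
    return 0.5 * coverage + 0.5 * (gpa / 4.0)  # score between 0 and 1
```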
These programs also can learn as they analyze more data. Known as machine-learning algorithms, they can change and adapt like humans so they can better predict outcomes. Amazon uses similar algorithms to learn customers’ buying habits and target ads more accurately, and Netflix uses them to learn users’ movie tastes when recommending new viewing choices.
But there has been a growing debate on whether machine-learning algorithms can introduce unintentional bias much like humans do.
“The irony is that the more we design artificial intelligence technology that successfully mimics humans, the more that A.I. is learning in a way that we do, with all of our biases and limitations,” Venkatasubramanian says.
Disparate impact

Venkatasubramanian’s research tests whether these software algorithms can be biased under the legal definition of disparate impact, a theory in U.S. anti-discrimination law that says a policy may be considered discriminatory if it has an adverse impact on any group based on race, religion, gender, sexual orientation or other protected status.
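In practice, disparate impact is often quantified by comparing selection rates between groups; under the commonly cited “four-fifths” guideline, a ratio below 0.8 is treated as a red flag. The sketch below illustrates the idea with invented numbers, not data from the study.

```python
def disparate_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of the lower group's selection rate to the higher group's.
    Under the four-fifths guideline, a ratio below 0.8 is a warning sign."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# example with made-up numbers: 30 of 100 hired in group A, 18 of 100 in group B
print(disparate_impact_ratio(30, 100, 18, 100))  # 0.6 -> below the 0.8 guideline
```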
Venkatasubramanian’s research revealed that you can use a test to determine if the algorithm in question is possibly biased. If the test — which ironically uses another machine-learning algorithm — can accurately predict a person’s race or gender from the data being analyzed, even though race and gender have been hidden from that data, then there is a potential for bias under the definition of disparate impact.
“I’m not saying it’s doing it, but I’m saying there is at least a potential for there to be a problem,” Venkatasubramanian says.
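In code, the test boils down to fitting an auxiliary classifier on the non-protected features and checking how well it recovers the hidden protected attribute. The sketch below uses scikit-learn’s logistic regression and an illustrative accuracy cutoff; the model and threshold used in the actual study may differ.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def flag_potential_bias(features_without_protected, protected_attr, cutoff=0.75):
    """If a classifier can predict the hidden protected attribute (race, gender)
    from the remaining features well above chance, those features leak the
    attribute, so a downstream hiring model could still discriminate."""
    clf = LogisticRegression(max_iter=1000)
    accuracy = cross_val_score(clf, features_without_protected, protected_attr, cv=5).mean()
    return accuracy, accuracy > cutoff  # cutoff is an illustrative choice, not from the paper
```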
If the test reveals a possible problem, Venkatasubramanian says it’s easy to fix. All you have to do is redistribute the data being analyzed — say, the job applicants’ information — so the algorithm can no longer see the information that could be used to create the bias.
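One way to do that kind of redistribution (a minimal sketch, assuming a numeric feature in a pandas DataFrame; the paper’s actual repair procedure may differ in detail) is to reshape each group’s values so the feature is distributed the same way for every group, while each applicant keeps their rank within their own group.

```python
import numpy as np
import pandas as pd

def repair_feature(df: pd.DataFrame, feature: str, group_col: str) -> pd.Series:
    """Rank-preserving repair (sketch): replace each value with the pooled
    distribution's value at the same within-group percentile, so the repaired
    feature no longer reveals which group an applicant belongs to."""
    pooled = np.sort(df[feature].astype(float).to_numpy())
    repaired = df[feature].astype(float).copy()
    for _, idx in df.groupby(group_col).groups.items():
        pct = df.loc[idx, feature].astype(float).rank(pct=True)  # within-group percentile
        repaired.loc[idx] = np.quantile(pooled, pct)
    return repaired
```

After a repair like this, the auxiliary test above should no longer be able to recover the protected attribute from that feature.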
“It would be ambitious and wonderful if what we did directly fed into better ways of doing hiring practices. But right now it’s a proof of concept,” Venkatasubramanian says.
In addition to Venkatasubramanian, the research was conducted by University of Arizona computer science assistant professor Carlos Scheidegger, Haverford College computer science assistant professor Sorelle Friedler, University of Utah doctoral student John Moeller and Haverford College undergraduate student Michael Feldman.