Wednesday, January 16, 2019

Gender Shades

Is there AI neutrality? 

According to research by Joy Buolamwini, no! There is built-in bias in the algorithms that favors white males: the AI identifies white males more easily than dark-skinned people, and it performs worst on dark-skinned women.


This is an issue when it comes to hiring qualified people. Qualifications should not be judged on gender, race, or sexuality, yet the built-in biases of the algorithms used in AI do exactly that.


Buolamwini thinks the biases built into AI can be corrected. According to the Forbes article where I found this, "After MIT's Buolamwini sent the results of her study to Microsoft, IBM and Face++, IBM responded by replicating her research internally, and releasing a new API, according to a conference goer who attended her presentation on Saturday."

"The updated system now classifies darker-skinned females with a success rate of 96.5%."
