
Gender Bias in AI and Machine Learning




AI and machine learning algorithms have a reputation for being calculated and analytical, which would imply that they are immune to the emotional biases that humans have. Unfortunately, this is not the case: AI algorithms are designed by human beings, and those same biases can be built into their calculations. In this article we're going to talk about gender bias in AI and machine learning.


Where AI Gender Bias comes from


AI systems are biased because they are created by humans who carry these biases in their own thinking. The first thing to consider is who designs the AIs and makes the decisions in their production. For the most part this is men: only about 22% of professionals in AI and data science are women, and even those who do hold jobs in the field tend to occupy lower-status positions, so they aren't making many of the decisions that drive these projects. These professionals are responsible for generating, collecting and labelling the data the AI learns from. Since women are underrepresented, it's possible for biases to be introduced, whether intentionally or not.


Next we need to consider that the data available to AI professionals itself overrepresents men. There are over 300 million fewer women than men online, and in low- and middle-income countries women are 20% less likely than men to own smartphones. This means much of the information collected from public sources on the internet, such as social media, underrepresents women, and algorithms trained on that information are less likely to classify women correctly.


Lastly, data isn't usually disaggregated by sex or gender (among other factors). This blurs distinctions in the data, makes it less likely that an AI algorithm will pick up the key differences between genders, and further hides any over- or under-representation of either gender. A simple sketch of what disaggregated evaluation can look like follows below.
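

One practical way to counter this is to evaluate a model's results broken down by group rather than only in aggregate. The Python sketch below is a minimal illustration of that idea, not a description of any specific system: the file name and the "gender", "label" and "prediction" columns are hypothetical placeholders.

    # Minimal sketch of disaggregated evaluation.
    # Assumes a hypothetical CSV of model predictions with
    # "gender", "label" and "prediction" columns.
    import pandas as pd

    df = pd.read_csv("face_predictions.csv")  # placeholder file name

    # An overall error rate can hide large differences between groups.
    overall_error = (df["prediction"] != df["label"]).mean()
    print(f"Overall error rate: {overall_error:.1%}")

    # Disaggregating by gender surfaces any gap between groups.
    for gender, group in df.groupby("gender"):
        error = (group["prediction"] != group["label"]).mean()
        print(f"{gender}: {error:.1%} error rate on {len(group)} samples")

A model can look accurate on average while performing far worse for one group, which is exactly the kind of gap that aggregated reporting conceals.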


Examples of AI Gender Bias


One example of how this AI gender bias can do harm was found in the credit industry. Early algorithms used marital status and gender to determine creditworthiness, which resulted in women receiving lower credit limits; fortunately, this practice has since been corrected. The Gender Shades research project also found that facial analysis systems misclassified women far more often than men: darker-skinned women were misclassified at a rate of 35%, compared to an error rate of 0.8% for lighter-skinned men.


How to get more free content


If you liked this article and would like more of our cybersecurity insights, tips and tricks, feel free to follow us on social media. If you're a struggling business owner who needs help assessing your business's cybersecurity posture, take advantage of our free introductory assessment and we'll help you figure out a game plan for keeping your company safe.
