The gender pay gap and women’s representation in leadership roles continue to captivate headlines, but what action is really being taken? It’s time to take a journey through the application process of a young woman, let’s call her Mira, looking to land an interview at a Science, Technology, Engineering or Math (STEM) focused corporation.
A large part of Mira’s job search happens online, where she turns to various social platforms to seek out new opportunities. Depending on the channel she picks, she will be shown job advertisements selected by pay-per-click (PPC) bidding and ad-purchasing algorithms that carry their own biases.
A Management Science study found that advertising a STEM job across various social networks reached 20% fewer women than men, even though gender discrimination in job advertising is illegal in both the US and Europe.
Since advertising algorithms are designed to optimise advertising spend and maximise reach, fewer potential female candidates were exposed to the ad and therefore never got the opportunity to apply. Put simply, the algorithms discriminate against female candidates because they cost more to reach, even if a female candidate would ultimately be the better hire and require fewer resources to train.
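To see why pure reach-maximisation skews delivery, here is a minimal sketch with hypothetical cost numbers (the function name and the cent figures are illustrative, not from any real ad platform): if other advertisers bid aggressively for female attention, female impressions cost more, and an optimiser whose only objective is impressions per dollar pours the whole budget into the cheaper male audience.

```python
def allocate_impressions(budget, cost_per_impression):
    """Greedy reach-maximiser: spend the entire budget on the cheapest
    audience. This is what a pure 'maximise impressions per dollar'
    objective does, with no fairness constraint in sight."""
    cheapest = min(cost_per_impression, key=cost_per_impression.get)
    return {group: (budget // cost if group == cheapest else 0)
            for group, cost in cost_per_impression.items()}

# Hypothetical costs in cents: female impressions are pricier because
# other advertisers compete for the same audience.
costs = {"male": 2, "female": 3}
print(allocate_impressions(600, costs))  # {'male': 300, 'female': 0}
```

A real bidding system is far more nuanced, but the incentive is the same: any objective that counts only eyeballs per dollar will quietly under-serve the more expensive audience.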
Despite all of these challenges and biases, Mira presses on and finds a job posting in her field that captures her attention. As she reads further into the role and the company, she is both consciously and unconsciously influenced by the language used to describe the position, and that language will shape her next step. STEM in particular is imbalanced because fewer women are trained for it, and it therefore remains a largely male-dominated industry. That imbalance carries over into the biased language of the industry’s job listings, which in turn biases the data that job board algorithms are trained on.
A University of Waterloo and Duke University study showed that male-dominated industries (such as STEM) use masculine-themed words (like “leader”, “competitive” and “dominant”) which, when read by female applicants, lowered their perception of the job’s appeal and their sense of “belongingness”, even when they felt able to perform the job. Beyond this, it has been widely reported that a woman will typically apply for a job only if she fulfills 100% of the criteria, whereas a man will apply if he feels he fulfills just 60%.
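This kind of language is straightforward to screen for. A minimal sketch, using a tiny illustrative subset of masculine-coded word stems in the spirit of the study’s word lists (not the actual lexicon), that flags coded terms in a job ad:

```python
import re

# Illustrative subset of masculine-coded stems; the real study lists
# are much longer.
MASCULINE_STEMS = ["lead", "compet", "domina", "assert", "ambiti"]

def flag_masculine_terms(ad_text):
    """Return the distinct words in the ad that begin with a
    masculine-coded stem."""
    words = re.findall(r"[a-z]+", ad_text.lower())
    return sorted({w for w in words
                   if any(w.startswith(stem) for stem in MASCULINE_STEMS)})

ad = "We want a dominant, competitive leader to drive aggressive growth."
print(flag_masculine_terms(ad))  # ['competitive', 'dominant', 'leader']
```

Several commercial “gender decoder” tools work on essentially this principle, scoring a listing by the balance of masculine- and feminine-coded vocabulary.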
Determined as ever, Mira eventually finds a job description and company she feels confident about, and she submits an application. Her CV and cover letter are parsed and ranked alongside those of other applicants, male and female. Each success factor identified in the words Mira has used, like her programming skills, is weighted according to what has historically led to successful hires at that particular company.
In the age of Artificial Intelligence, past hiring decisions are used to train algorithms to identify the candidates best suited to the company. The issue with biased data is that even if gender is excluded from the application itself, the gender distribution of the training data may be strongly imbalanced, since the industry has historically been male-dominated. This means that even if Mira’s resume reaches the hiring company’s or job board’s fit algorithm, she may still not receive an interview because of the bias baked into the model.
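The mechanism is easy to demonstrate. A minimal sketch, with an entirely made-up hiring history and keyword names, of how a naive scorer that weights resume keywords by past hire rates ends up penalising terms that merely correlate with an underrepresented group, even though gender is never a feature:

```python
from collections import defaultdict

def learn_keyword_weights(past_resumes):
    """Weight each keyword by the hire rate of past resumes containing it.
    Any term correlated with an underrepresented group inherits that
    group's historically low hire rate, with gender never appearing
    as an explicit feature."""
    seen, hired = defaultdict(int), defaultdict(int)
    for keywords, was_hired in past_resumes:
        for kw in keywords:
            seen[kw] += 1
            hired[kw] += was_hired
    return {kw: hired[kw] / seen[kw] for kw in seen}

# Hypothetical history from a male-dominated firm: 'rugby_captain'
# appears on hired resumes, 'womens_chess_club' mostly on rejected ones.
history = [
    ({"python", "rugby_captain"}, 1),
    ({"python", "rugby_captain"}, 1),
    ({"python", "womens_chess_club"}, 0),
    ({"python", "womens_chess_club"}, 1),
]
weights = learn_keyword_weights(history)
print(weights["rugby_captain"], weights["womens_chess_club"])  # 1.0 0.5
```

Real resume-screening models are far more sophisticated, but the failure mode is the same: the model rediscovers gender through its proxies in the text.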
If you’re not convinced by Mira’s journey, here’s a real-life example: Amazon’s experimental hiring tool was used to screen and rate resumes based on historical hiring patterns. At the time, the majority of Amazon’s employees were men, and the system inevitably taught itself to penalise female resumes, with far greater efficiency than a human.
Still unsure if data bias is perpetuating gender bias in STEM? Check out these articles from others in the industry:
How Unconscious Bias Holds Women Back
AI Is Demonstrating Gender Bias