Why are algorithms sexist?



Nothing can describe my state of mind when, for the fiftieth time, my beloved translation software insisted on rendering in the masculine the beautiful feminine phrasing I had asked it to translate into English.


My interlocutor, the program, made himself comfortable, using all the unisex grammatical latitude that Shakespeare's language affords him to surreptitiously bury my "elle" or "sa" in a "he" or a "his".


So, I wanted to know more: why is it that even in a machine, the masculine gender spontaneously predominates? It's the algorithms, I thought. But I was also thinking deep inside that these algorithms, which I visualized as an unintelligible mathematical sequence, couldn't have a gender preference. So, like any self-respecting curious person, I went to find out.


Point 1: The algorithm for dummies.


An algorithm is a kind of step-by-step instruction manual for reaching a given goal. The comparison most often offered is the recipe. For example, if your goal is to make a pizza, your algorithmic recipe will be: take flour, then water, then olive oil, then tomato sauce, etc., the steps leading up to putting the shaped pizza in the oven at 180°C.


Things get more complicated when there are several possible options. If, at the moment of topping the pizza, there are several possibilities, the algorithm can introduce the variables, in the form of a tree, according to what you have in the fridge. For example, at the "choice of topping" step, the algorithm could specify that if there are mushrooms, you put them on; if there are no mushrooms but there is ham, you put ham on; if there are neither mushrooms nor ham, you put on nothing but tomato sauce. And if there are both mushrooms and ham, the algorithm can look at your past choices and decide for itself what to offer you, based on them. That is how you get from the recipe to the autonomous car, to medical diagnosis by artificial intelligence, or to real-time analysis of the financial markets.
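The topping decision tree above can be sketched in a few lines of code. This is a minimal illustration, not real recommendation software: the fridge contents and the order history are made-up example data, and the "look at your past choices" step is reduced to simply counting which topping was ordered most often.

```python
def choose_topping(fridge, history):
    """Follow the branching rules from the recipe to pick a topping."""
    has_mushrooms = "mushrooms" in fridge
    has_ham = "ham" in fridge

    if has_mushrooms and has_ham:
        # Both available: fall back on past orders and pick the
        # topping that was chosen most often before.
        return max(("mushrooms", "ham"), key=history.count)
    if has_mushrooms:
        return "mushrooms"
    if has_ham:
        return "ham"
    return "tomato sauce only"

print(choose_topping({"ham", "cheese"}, []))  # -> ham
print(choose_topping({"mushrooms", "ham"},
                     ["mushrooms", "ham", "mushrooms"]))  # -> mushrooms
```

Note that the last branch is where the machine stops following fixed rules and starts deciding from data about you, which is exactly where the rest of this article picks up.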


Of course, behind this process, there is a human being who translates this recipe into terms that a machine can understand, and that's where the problem lies.


Flora Vincent and Aude Bernheim, in "Artificial Intelligence, Not Without Them!", published in 2019, point out that AI reflects the society in which we live, along with its biases. Aude Bernheim takes precisely the example of translation software.


Hallelujah! I wasn't completely wrong to wonder. Thus, "the doctor" is always "le docteur" and "the nurse" is always "l'infirmière".


You'll tell me it's all quite anecdotal, but less so when facial recognition software trains mostly on images of white men, or when the algorithms of some AIs have learned to associate the female gender with the kitchen of a house from thousands of images of women in that type of room. Existing biases are not only replicated but, above all, amplified, as Hannah Kuchler notes in a Financial Times article of March 9, 2018, "Tech's sexist algorithms and how to fix them".


Even more troubling is the algorithm that determines a lower salary range for women, based on statistics of what a woman in a given position has earned in the past.
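To see how that works, here is a deliberately naive sketch. The salary figures are invented for illustration; the point is that a "model" which predicts from historical statistics, split by gender, can only reproduce the pay gap baked into its data.

```python
# Hypothetical past salaries for the same position (illustrative numbers).
past_salaries = {
    "men":   [52000, 55000, 58000],
    "women": [45000, 47000, 46000],
}

def predicted_range(gender):
    """Naive 'model': predict the range observed in the historical data."""
    data = past_salaries[gender]
    return min(data), max(data)

print(predicted_range("women"))  # -> (45000, 47000): yesterday's gap, reproduced
print(predicted_range("men"))    # -> (52000, 58000)
```

Nothing in the code is hostile to women; the discrimination lives entirely in the training data it was handed.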


In the world of work, one glaring example is Amazon's hiring system, which simply had to be shelved when it proved sexist and discriminatory because it had been trained mostly on men's résumés.


The danger of an exponential increase in inequality is very real. To return to my pizza example: imagine that my algorithm detects that I have chosen mushrooms in the majority of cases. It might then stop offering me ham pizza altogether and confine me to an eternity of mushrooms.
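That feedback loop can be simulated in a few lines. This is a toy model under made-up assumptions: the order history starts with a slight mushroom majority, the recommender always suggests the current majority, and I accept its suggestion 9 times out of 10. Watch the initial 60% share grow.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

# Start from a slight majority of mushroom orders (illustrative history).
history = ["mushrooms"] * 6 + ["ham"] * 4

def recommend(history):
    """Always suggest whatever was ordered most often so far."""
    return max(set(history), key=history.count)

# Accepting the recommendation 90% of the time starves the minority
# choice of new data, so the majority share only grows.
for _ in range(50):
    pick = recommend(history) if random.random() < 0.9 else "ham"
    history.append(pick)

share = history.count("mushrooms") / len(history)
print(f"mushroom share after 50 rounds: {share:.0%}")
```

A 60/40 preference drifts well above its starting point: the loop does not record my tastes, it manufactures them. Replace "mushrooms" with a salary band or a job category and the stakes become obvious.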


Phew! The problem is far more widespread than I thought. I have been flooded with articles on the subject, which makes me all the more surprised that no solution has yet been found.


Apparently, according to Flora Vincent, the geek universe remains predominantly male, with girls themselves seeing it as elitist and hard to access, and censoring themselves out of it.


But why, girls?


Joy Adowaa Buolamwini, a computer scientist at MIT, has been fighting for years against bias in software decision-making. She founded the Algorithmic Justice League and offers solutions to overcome what should never be a foregone conclusion.


Moreover, legally, who would be responsible for the discrimination and prejudice propagated by an AI? The coder of the algorithm, the author of the source database, or the user?


With my silly question, I opened a Pandora's box I did not expect. Naively, I had told myself that in the age of technology, where the machine has neither moods nor gender, these failings would gradually disappear.


Convinced that we are all trapped in our mental algorithms, women as well as men, I see only one way to change things: to talk about it, and talk about it again, and to remember that the first person to have written an algorithm for a machine, in the 19th century, was called Ada Lovelace. She preceded illustrious scientists like Ida Rhodes, who designed the C-10 computer language with Betty Holberton in the early 1950s; Grace Hopper, who played a pivotal role in the development of the COBOL language; Fran Allen, the first woman to be named an IBM Fellow and to receive the ACM Turing Award for "significant and lasting contributions to the field of computer science"; Karen Spärck Jones, originator of the concept of "inverse document frequency", a technology behind modern search engines such as Google; Adele Goldberg, co-creator of the Smalltalk-80 language, which influenced the design of the first Macintosh; Rosalind Picard, professor at the MIT Media Lab, where she leads research in affective computing; Nnenna Nwakanma, who campaigns for affordable internet access through the Free Software and Open Source Foundation for Africa (FOSSFA); the Indian IT specialist Neelam Dhawan, ranked by Forbes magazine among the most influential women in the world; and Hedy Lamarr, a well-known actress and inventor of genius.


Ladies, I would like to pay tribute to you here.



picture by Ali Pazani


©2020 by Empowerment foundation