Google Translate is Sexist. Here’s Why.

[Image courtesy of Nextgov]

Most people assume that Google, one of the world’s most reputable tech companies, always translates text from one language to another accurately. Many would be surprised, then, by Google Translate’s offensive mistakes.

Word embedding, a machine-learning technique used by Google Translate, works by mapping each word to a vector of numbers, which algorithms use to estimate how likely words are to appear together. By looking at which words tend to surround other words in a sentence, the computer figures out which ones best fit together to form a phrase. For example, if the model sees that the words “nurse” and “she” often appear in the same sentence, then, based on that probability, it will likely pair them together when translating.
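To see how this statistical pairing works, here is a minimal sketch in Python. The tiny corpus, the window size, and the helper most_likely_pronoun are all invented for illustration; Google Translate’s actual models are vastly more sophisticated, but the underlying pull is the same: whichever pronoun appears most often near a word wins.

# A toy illustration of co-occurrence statistics (not Google's actual system).
# We count how often each pronoun appears near a given word; a translator that
# simply picks the most probable pronoun will then render a gender-neutral
# source pronoun as "she" for "nurse" and "he" for "doctor".
from collections import Counter, defaultdict

corpus = [
    "she is a nurse at the hospital",
    "the nurse said she was tired",
    "he is a doctor at the hospital",
    "the doctor said he was busy",
    "she is a nurse and he is a doctor",
]

window = 3                            # words on each side that count as "context"
pronouns = {"he", "she"}
near_pronoun = defaultdict(Counter)   # word -> counts of pronouns seen nearby

for sentence in corpus:
    words = sentence.split()
    for i, word in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i and words[j] in pronouns:
                near_pronoun[word][words[j]] += 1

def most_likely_pronoun(word):
    # Pick the pronoun that most often appears near `word` in the corpus.
    counts = near_pronoun[word]
    total = sum(counts.values())
    pronoun, count = counts.most_common(1)[0]
    return pronoun, count / total

for noun in ("nurse", "doctor"):
    pronoun, prob = most_likely_pronoun(noun)
    print(f"{noun}: '{pronoun}' with estimated probability {prob:.2f}")

Run on this toy corpus, the sketch prefers “she” for “nurse” and “he” for “doctor,” not because of anything about the jobs themselves, but because of how often each pronoun happens to sit near each word in the text it learned from.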

When translating, Google Translate has been shown to have a statistical bias toward male nouns and pronouns. For instance, when translating from Spanish to English, “dice” becomes “he says,” even though the Spanish verb form carries no gender at all.

Turkish, a gender-neutral language, has only one third-person singular pronoun: “o.” Unlike Spanish, French, and other languages that mark gender through articles and endings, Turkish uses “o” whether the word refers to a “she,” a “he,” or an “it.” So when Google Translate translates from Turkish into another language, it has to guess what “o” means. It guesses that “he is hard-working” while “she is lazy.” It guesses that “he is a doctor” while “she is a nurse.” It guesses that “he is an engineer” while “she is a cook.” It guesses that “he is pessimistic” while “she is optimistic.”

When Google Translate does choose a female noun or pronoun, it tends to do so only when the choice fits a gender stereotype. Translate the phrase “men are men, and men should clean the kitchen” into German and it becomes “Männer sind Männer, und Frauen sollten die Küche sauber,” which in English means “Men are men, and women should clean the kitchen.”

The sexism isn’t necessarily Google’s fault. The translation model learns from enormous amounts of human-written text, so it picks up the limitations and tendencies already present in that data. The model itself isn’t sexist; rather, it absorbs existing human biases and projects them back into the world of language.

Boasting 103 languages and around 500 million users, Google Translate is easily the world’s most popular translation tool, which is why this issue so desperately needs our attention. The question, however, is not whom to blame, but how we can stop this. By translating this way, Google perpetuates false stereotypes and reinforces the belief that these bigoted ideas are true.

Even in 2018, all around the world, women have to fight every single day to be considered equal to men. From the enormous gender wage gap in the workplace to the sexual harassment women encounter almost everywhere at every age, reinforcing these misconceptions is an added burden that we do not need. Given Google’s size and resources, it is the company’s duty to work to amend this mistake. Solving this problem will not magically rid the world of all its sexism and discrimination, but it would be a start. It would be a step in the right direction. At least “she” would have a reason to remain optimistic.


-Shreya Joshi