AI becomes biased by learning from us

Angel of light? Google AI processes thought. (Background image: Screen grab, YouTube/ColdFusion)

[Ed. – We already knew this, but the ways in which we keep discovering it are at least entertaining.  Apparently Google Translate needs to be taken behind the barn and shot.  It’s not the only one.]

Late last year, a St. Louis tech executive named Emre Şarbak noticed something strange about Google Translate. He was translating phrases from Turkish, a language that uses a single gender-neutral pronoun, "o," in place of "he" or "she." But when he asked Google's tool to turn the sentences into English, the results read like a children's book from the 1950s. The ungendered Turkish sentence "o is a nurse" became "she is a nurse," while "o is a doctor" became "he is a doctor."

The website Quartz went on to compose a sort of poem highlighting some of these phrases: Google's translation program decided that soldiers, doctors and entrepreneurs were men, while teachers and nurses were women. Overwhelmingly, it rendered the professions as male. Quartz noted that Finnish and Chinese translations had similar problems.

What was going on? Google's Translate tool "learns" language from an existing corpus of writing, and that writing reflects cultural patterns in how men and women are described.
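
To see how a purely statistical learner inherits such patterns, here is a minimal toy sketch in Python. This is not Google's actual system, and all corpus counts below are invented for illustration: the point is only that if the training text pairs "doctor" with "he" far more often than with "she," the most likely rendering of the gender-neutral Turkish "o" is the male pronoun.

```python
# Toy sketch (NOT Google's system; counts are invented): a translator
# that resolves the gender-neutral Turkish pronoun "o" by picking
# whichever English pronoun co-occurs with the profession most often
# in its training corpus.
from collections import Counter

# Hypothetical counts of gendered sentences observed in training text.
corpus_counts = Counter({
    ("doctor", "he"): 940, ("doctor", "she"): 60,
    ("nurse", "he"): 50,   ("nurse", "she"): 950,
})

def translate_turkish_pronoun(profession: str) -> str:
    """Translate 'o is a <profession>' by majority vote over the corpus."""
    he_count = corpus_counts[(profession, "he")]
    she_count = corpus_counts[(profession, "she")]
    pronoun = "he" if he_count >= she_count else "she"
    return f"{pronoun} is a {profession}"

print(translate_turkish_pronoun("doctor"))  # -> "he is a doctor"
print(translate_turkish_pronoun("nurse"))   # -> "she is a nurse"
```

The model never decides that doctors "should" be men; it simply reproduces the skew already present in the text it was fed, which is exactly the behavior Şarbak observed.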

