THE ALGORITHMIC WHITE GUY PROBLEM is getting worse.

In 2015, the first woman to appear in a Google Image search for 'CEO' was Barbie. More than seven years later, the problem of bias in technology has not improved.

“Machine learning is great if you’re using it to work out the best way to route an oil pipeline,” Noel Sharkey, emeritus professor of robotics and AI at the University of Sheffield, emphasises in an interview with The Guardian. “Until we know more about how biases work in them, I’d be very concerned about them making predictions that affect people’s lives.”

Machine learning software that learns from large data sets, such as facial recognition, voice recognition and machine translation, picks up inherent social biases. Google Translate, for example, defaults to the masculine pronoun, repeatedly changing «she said» to «he said». As Stefaan Verhulst, co-founder and Chief Research and Development Officer of The GovLab, puts it: “We wondered why this happened and we found out: because Google Translate works on an algorithm, the problem is that «he said» appears on the web four times more than «she said», so the machine gets it right if it chooses «he said».” He also found that English usage has changed considerably since 1968, with the ratio of «he said» to «she said» falling from 4-to-1 to 2-to-1. “But, still, the translation does not take this into account. So we went to Google and we said «Hey, what is going on?» and they said «Oh, wow, we didn’t know, we had no idea!» So what we recognized is that there is an unconscious gender bias in the Google algorithm. They did not intend to do this at all, so now there are a lot of people who are trying to fix it…”
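
The mechanism Verhulst describes is easy to reproduce. A translator that simply picks whichever candidate phrase is most frequent in its training data will output the majority pronoun every time, turning a 4-to-1 (or 2-to-1) skew in the corpus into a 100% skew in the output. Below is a minimal sketch of that failure mode; the corpus counts and the function are invented for illustration and are not Google Translate's actual method.

```python
# Minimal sketch of frequency-based pronoun choice, the failure mode
# Verhulst describes. The corpus counts are invented for illustration.

corpus_counts = {
    "he said": 4_000_000,   # hypothetical web frequency (4-to-1 skew)
    "she said": 1_000_000,  # hypothetical web frequency
}

def pick_translation(candidates):
    """Return the candidate phrase seen most often in the corpus.

    A purely frequency-driven chooser is 'right' 80% of the time on
    this data, yet it outputs the masculine form 100% of the time: a
    skew in the data becomes a deterministic rule in the output.
    """
    return max(candidates, key=lambda phrase: corpus_counts.get(phrase, 0))

# A source sentence with a gender-neutral pronoun could map to either
# candidate; this model always resolves it the same way.
print(pick_translation(["he said", "she said"]))  # -> he said
```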

As MIT grad student Joy Buolamwini states in her talk: “Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices.” She quotes data scientist Cathy O'Neil, who in her book “Weapons of Math Destruction” writes about the rise of new WMDs: widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that affect ever more aspects of our lives. “So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?”

Algorithmic bias can lead to discriminatory practices:

  1. New Yorkers living in areas at greater risk of stop-and-frisk by police are also more exposed to invasive facial recognition technology, new research by Amnesty International and partners has revealed. “Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City,” said Matt Mahmoudi, Artificial Intelligence and Human Rights Researcher at Amnesty International.

  2. Driverless cars: AI systems are consistently better at identifying pedestrians with lighter skin tones than those with darker ones. According to The Guardian, not by a little bit: one headline comparison suggests that a white person was 10% more likely than a black person to be correctly identified as a pedestrian.

  3. Facial-recognition systems regularly identify white faces much faster, and more accurately, than Black female faces.

  4. Female historians and male nurses still do not exist in Google Translate.

  5. Biased Google search results affect hiring decisions. Research from New York University highlights how harmful such biased search results can be. The study reveals that even gender-neutral internet searches often yield male-dominated results, which can have a significant impact on hiring decisions and help to propagate gender biases.

  6. Researchers from the Swiss Federal Institute of Technology in Zurich, Switzerland, estimate that, as a result of gender bias, papers whose first authors are women receive around 10% fewer citations than those first-authored by men.

  7. According to Rachael Tatman, a linguistics researcher and National Science Foundation Graduate Research Fellow at the University of Washington, Google’s speech recognition software has a gender bias.

  8. A Google image search for “CEO” is an infinite scroll of white people, mostly men. A team of researchers from Carnegie Mellon University claims that Google's algorithm shows prestigious job ads to men, but not to women. “I think our findings suggest that there are parts of the ad ecosystem where kinds of discrimination are beginning to emerge and there is a lack of transparency,” Carnegie Mellon professor Anupam Datta told Technology Review. “This is concerning from a societal standpoint.”

  9. New research shows that subtle gender bias is entrenched in the data sets used to teach language skills to AI programs. Researchers from Boston University and Microsoft Research New England found that word embeddings trained on those data sets place “programmer” closer to “man” than to “woman”, and rank “homemaker” as the word most similar to “woman” (see the sketch after this list).

  10. Machine translation systems such as Google Translate and Systran default to the masculine pronoun, rendering ambiguous source sentences as «he said» instead of «she said».
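
The finding in item 9 is, at bottom, vector geometry: a word-embedding model represents each word as a list of numbers, and bias shows up as an occupation word sitting measurably closer to one gendered word than the other under cosine similarity. Here is a minimal sketch with made-up two-dimensional vectors; none of the numbers come from the actual study, and real embeddings have hundreds of dimensions, but the bias check is the same similarity test.

```python
import math

# Toy 2-D word vectors, invented for illustration only; the embeddings
# examined by the Boston University / Microsoft researchers are far
# higher-dimensional, but the cosine-similarity comparison is the same.
vectors = {
    "man":        (0.9, 0.1),
    "woman":      (0.1, 0.9),
    "programmer": (0.8, 0.2),   # skewed toward "man" in this toy data
    "homemaker":  (0.2, 0.85),  # skewed toward "woman" in this toy data
}

def cosine(a, b):
    """Cosine similarity between two word vectors (1.0 = same direction)."""
    va, vb = vectors[a], vectors[b]
    dot = sum(x * y for x, y in zip(va, vb))
    return dot / (math.hypot(*va) * math.hypot(*vb))

# The reported bias, reproduced on toy data: each occupation word sits
# measurably closer to one gendered word than to the other.
print(cosine("programmer", "man"), cosine("programmer", "woman"))
print(cosine("homemaker", "woman"), cosine("homemaker", "man"))
```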
