
Study finds some facial recognition systems only accurate for white male faces

Gizmag - Mon 12 Feb 18

A new study from MIT and Stanford University researchers has found that three commercial facial analysis programs demonstrated significant error rates in determining the gender of subjects ...

Study finds gender and skin-type bias in commercial artificial-intelligence systems

TechXplore - Mon 12 Feb 18

Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper by researchers from MIT and Stanford ...

Study finds gender and skin-type bias in commercial artificial-intelligence systems, ScienceDaily - Mon 12 Feb 18
Study finds gender and skin-type bias in commercial artificial-intelligence systems, Eurekalert - Mon 12 Feb 18
Study finds gender and skin-type bias in commercial artificial-intelligence systems, Science Blog - Mon 12 Feb 18

Study finds popular face ID systems may have racial bias

Daily Mail - Mon 12 Feb 18

Researchers found that Microsoft and IBM's facial recognition systems often inaccurately identified users who were female or dark-skinned. The systems were able to identify white males with ...

AI facial analysis demonstrates both racial and gender bias

Engadget - Mon 12 Feb 18

Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full article will be presented ...

Facial recognition software is biased towards white men, researcher finds

The Verge - Sun 11 Feb 18

New research out of MIT’s Media Lab is underscoring what other experts have reported or at least suspected before: facial recognition technology is subject to biases based on the ...

Study Finds Gender, Skin-type Bias in Commercial AI Systems

Laboratory Equipment - Mon 12 Feb 18

Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper. Contributed Author: MIT. Topics: A.I./Robotics