MIT Research News

February 11, 2018

Study finds bias in commercial AI systems

Examination of facial-analysis software shows error rate of 0.8 percent for light-skinned men, 34.7 percent for dark-skinned women.

Larry Hardesty | MIT News Office

Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency.