
World

Amazon: Facial recognition bias claims are 'misleading'

February 5, 2019 5:31 am

Amazon has defended its facial-recognition tool, Rekognition, against claims of racial and gender bias, following a study published by the Massachusetts Institute of Technology.

The researchers compared tools from five companies, including Microsoft and IBM.

While none of the tools was 100% accurate, the study found that Amazon’s Rekognition performed the worst at recognising women with darker skin.

Amazon said the study was “misleading”.

The study found that Amazon had an error rate of 31% when identifying the gender of images of women with dark skin.

This compared with a 22.5% rate from Kairos, which offers a rival commercial product, and a 17% rate from IBM.

By contrast, Amazon, Microsoft and Kairos all successfully identified images of light-skinned men 100% of the time.