Amazon Facial Recognition Software Shows Gender and Racial Bias

According to a study published by the MIT Media Lab on Jan. 24, 2019, Amazon’s facial technology makes more mistakes identifying gender, and has a more difficult time recognizing the gender of dark-skinned women, than comparable software from IBM and Microsoft.

Amazon’s Rekognition software misclassified women as men 19 percent of the time, the study reported, and misclassified darker-skinned women as men 31 percent of the time. By comparison, Microsoft’s software misclassified darker-skinned women as men 1.5 percent of the time.

Matt Wood, general manager of artificial intelligence at Amazon Web Services, said the MIT study was based on facial analysis, not facial recognition, and stressed that the two are different. Facial analysis finds faces in images and videos and can account for cosmetic changes, such as an individual wearing glasses or jewelry. Facial recognition, by contrast, matches a person’s face to images in photographs and videos. He noted that Amazon’s Rekognition software includes both functions.
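
For readers who want to see the distinction Wood describes in practice, below is a minimal sketch using Amazon’s boto3 Python client. The file names and region are placeholders and AWS credentials are assumed to be configured; this illustrates the two functions generally, and is not code from the study.

```python
# Sketch contrasting Rekognition's analysis and recognition functions.
# File names ("face.jpg", "reference.jpg") are illustrative placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

# Facial ANALYSIS: detect faces and estimate attributes such as gender.
# Gender classification is the task the MIT study audited.
analysis = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request attribute estimates, including Gender
)
for face in analysis["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} "
          f"(confidence {gender['Confidence']:.1f}%)")

# Facial RECOGNITION: match a face against a reference image.
with open("reference.jpg", "rb") as f:
    reference_bytes = f.read()

matches = client.compare_faces(
    SourceImage={"Bytes": reference_bytes},
    TargetImage={"Bytes": image_bytes},
    SimilarityThreshold=90,  # return only matches at or above 90% similarity
)
for match in matches["FaceMatches"]:
    print(f"Match similarity: {match['Similarity']:.1f}%")
```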

Deborah Raji, who co-authored the study with Joy Buolamwini, said she understood the distinction between facial analysis and facial recognition, and that the study made clear it examined the facial analysis task of gender classification. Raji explained that the analysis measured how many faces the software detected and how accurately it classified what it saw.
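
To illustrate the kind of measurement Raji describes, the sketch below computes gender misclassification rates broken out by subgroup, as an audit of this type would. The records, labels, and group names are illustrative placeholders, not data from the study.

```python
# Per-subgroup misclassification rates; records are made-up placeholders.
from collections import defaultdict

# Each record: (true_gender, predicted_gender, skin_type_group)
records = [
    ("female", "male",   "darker"),
    ("female", "female", "darker"),
    ("female", "female", "lighter"),
    ("male",   "male",   "darker"),
    ("male",   "male",   "lighter"),
    ("female", "male",   "darker"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for true_label, predicted, group in records:
    key = (true_label, group)
    totals[key] += 1
    if predicted != true_label:
        errors[key] += 1

# Report the misclassification rate for each (gender, skin-type) subgroup.
for key in sorted(totals):
    rate = errors[key] / totals[key]
    print(f"{key[0]} / {key[1]} skin: {rate:.0%} misclassified "
          f"({errors[key]}/{totals[key]})")
```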

In a blog post published on Medium on Jan. 25, Buolamwini warned readers to be skeptical of any company that claims its software is completely accurate. Amazon reports that it has tested its facial recognition capabilities on more than 1 million faces and that the software performed extremely well, she wrote, but the skin types and demographics of the faces Amazon used in its tests are unknown. Without that information, it is impossible to determine whether the software is biased by race or gender.

Amazon has supplied its Rekognition software to various law enforcement organizations. Members of Congress, civil liberties groups, and Amazon employees have raised privacy concerns, and Amazon shareholders have asked the company to stop selling Rekognition to government agencies.

Based on the conclusions of the MIT study, Buolamwini said, Amazon is acting irresponsibly and should stop selling its software to government agencies. Facial analysis technology can easily be abused for mass surveillance, she said, and its inaccuracies could result in innocent people being misidentified as criminals and charged with crimes they did not commit.

Jacob Snow, an attorney with the American Civil Liberties Union (ACLU), said Amazon’s reaction to the study is concerning and suggests the company is not taking the findings seriously.

Amazon’s website credits Rekognition with enabling the Washington County Sheriff’s Office in Oregon to identify suspects from thousands of photos more quickly.

The most recent MIT study follows one Buolamwini conducted in February 2018, which identified similar gender and racial biases in facial analysis software made by Microsoft, IBM, and the Chinese firm Megvii. After those results were shared, IBM and Microsoft said they would re-examine their software, and the new study shows they have since made improvements.

IBM has since made its curated dataset public to help improve the accuracy of such systems, and Microsoft has called for regulation of facial recognition technology to ensure the public’s safety.

Written by Barbara Sobel

Sources:

The Verge: Gender and racial bias found in Amazon’s facial recognition technology (again)
CBS News: Amazon face-detection technology shows gender and racial bias, researchers say
Washington Post: Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

Featured and Top Image Courtesy of Send Me Adrift’s Flickr Page – Creative Commons License

