Researchers say Amazon face-detection technology shows bias | Inquirer Technology


09:17 AM January 26, 2019

NEW YORK — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits.

The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time.


Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none.


Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life. The new study, released late Thursday, warns of the potential for abuse and threats to privacy and civil liberties from facial-detection technology.

Matt Wood, general manager of artificial intelligence with Amazon’s cloud-computing unit, said the study uses a “facial analysis” and not “facial recognition” technology. Wood said facial analysis “can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images.”


In a Friday post on the Medium website, MIT Media Lab researcher Joy Buolamwini responded that companies should check all systems that analyze human faces for bias.


“If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free,” she wrote.


Amazon’s reaction shows that it isn’t taking the “really grave concerns revealed by this study seriously,” said Jacob Snow, an attorney with the American Civil Liberties Union.

Buolamwini and Inioluwa Deborah Raji of the University of Toronto said they studied Amazon’s technology because the company has marketed it to law enforcement. Raji’s LinkedIn account says she is currently a research mentee for artificial intelligence at Google, which competes with Amazon in offering cloud-computing services.


Buolamwini and Raji say Microsoft and IBM have improved their facial-recognition technology since researchers discovered similar problems in a May 2017 study. Their second study, which included Amazon, was done in August 2018. Their paper will be presented on Monday at an artificial intelligence conference in Honolulu.

Wood said Amazon has updated its technology since the study and done its own analysis with “zero false positive matches.”


Amazon’s website credits Rekognition with helping the Washington County Sheriff’s Office in Oregon reduce the time it takes to identify suspects from hundreds of thousands of photo records.

TOPICS: Amazon, bias, face recognition, face recognition technology, law enforcement


© Copyright 1997-2024 INQUIRER.net | All Rights Reserved
