Amazon Schooled on AI Facial Technology By Turing Award Winner

In general, AI advances are good for our society. In particular cases, they can be bad. Take Amazon's Rekognition AI service. There is evidence that the service exhibited much higher error rates on images of darker-skinned women than on those of lighter-skinned men. Bloomberg reported that the dispute over Amazon's software has been raging for almost a year. The American Civil Liberties Union said a test it ran of Rekognition AI falsely matched 28 members of Congress with mugshots, and that those flagged were disproportionately people of color.

This gender and racial bias led to the call last Wednesday by 26 researchers, including Dr. Yoshua Bengio, this year's winner of the Turing Award (the equivalent of the Nobel Prize in the field of computing), for the company to stop selling Rekognition AI to police departments.

Although two Amazon officials, Matthew Wood and Michael Punke, had earlier responded to defend the fairness and correctness of the underlying algorithm, the concerned researchers argued that Amazon misrepresented both the technical details of the work and the state of the art in facial analysis and face recognition.

Amazon's response rests on the claim that the original study used to point out the bias of its AI software is not "reproducible", questioning the correctness and generalizability of that study. In contrast, the researchers noted that IBM and Microsoft were able to reproduce the results and use them to improve their facial-recognition systems. We are still waiting to hear Amazon's comments on this rebuttal.

This example of an AI service with potential bias highlights the importance of an ethical framework in the development and use of AI. This is exactly the topic of a roundtable hosted by the Artificial Intelligence World Society (AIWS) in Tokyo last month. We believe that government-level regulation is needed to prevent the broad release of AI software that may be biased against any population.