In general, AI advances are good for our society. In particular cases, they can be harmful. Take Amazon’s Rekognition AI service. There is evidence that the service exhibited much higher error rates on images of darker-skinned women than on images of lighter-skinned men. Our AIWS Weekly Newsletter last month discussed the controversies over Rekognition AI, Amazon’s facial recognition software. Notably, in March, a group of 26 prominent research scientists, including Dr. Yoshua Bengio, this year’s winner of the Turing Award (the equivalent of the Nobel Prize in the field of computing), called for the company to stop selling Rekognition AI to police departments.
The Washington Post just published a detailed report on this matter. The story began in late 2017, when the Washington County Sheriff’s Office in Oregon became the first law enforcement agency to test Rekognition. Almost overnight, deputies saw their investigative powers supercharged. But what if Rekognition gets it wrong? Earlier, after inquiries from the Post, Amazon updated its guidelines for law enforcement, advising officers to manually review all matches before detaining a suspect.
According to the Post, Amazon executives say they support national facial-recognition legislation, but argue that “new technology should not be banned or condemned because of its potential misuse.” On the other side, “people love to always say, ‘Hey, if it’s catching bad people, great, who cares,’ ” said Joshua Crowther, a chief deputy defender in Oregon, “until they’re on the other end.”
The question of whether we should use Rekognition for its value, knowing it is imperfect, carries real moral weight. This example of an AI service with potential bias highlights the importance of an ethical framework for the development and use of AI. This was exactly the topic of a roundtable hosted by the Artificial Intelligence World Society (AIWS) in Tokyo in March 2019. We believe that government-level regulation is needed to prevent the broad release of AI software that may be biased against any population.