An anonymous reader quotes a report from The New York Times: Over the last two years, Amazon has aggressively marketed its facial recognition technology to police departments and federal agencies as a service to help law enforcement identify suspects more quickly. Now a new study from researchers at the M.I.T. Media Lab has found that Amazon's system, Rekognition, had much more difficulty than comparable services from IBM and Microsoft in telling the gender of female faces and of darker-skinned faces in photos. The results raise questions about potential bias that could hamper Amazon's drive to popularize the technology. In the study, published Thursday, Rekognition made no errors in recognizing the gender of lighter-skinned men. But it misclassified women as men 19 percent of the time, the researchers said, and mistook darker-skinned women for men 31 percent of the time. Microsoft's technology mistook darker-skinned women for men just 1.5 percent of the time.

Study co-author Joy Buolamwini said she sent a letter with some preliminary results to Amazon seven months ago, but that she hadn't heard back, and that when she and a co-author retested the company's product a couple of months later, it had not improved. "It's not possible to draw a conclusion on the accuracy of facial recognition for any use case -- including law enforcement -- based on results obtained using facial analysis," said Matt Wood, general manager of AI at Amazon Web Services. He added that the researchers had not tested the latest version of Rekognition, which was updated in November. "Amazon said that in recent internal tests using an updated version of its service, the company found no difference in accuracy in classifying gender across all ethnicities," the NYT reports. The new study is scheduled to be presented Monday at an artificial intelligence and ethics conference in Honolulu.
Read more of this story at Slashdot.