According to a new paper from the Georgia Institute of Technology, autonomous cars could disproportionately endanger pedestrians with darker skin, a troubling sign of how AI can inadvertently reproduce prejudices from the wider world. Futurism reports:

[In the paper, the researchers] detail their investigation of eight AI models used in state-of-the-art object detection systems. These are the systems that allow autonomous vehicles to recognize road signs, pedestrians, and other objects. They tested these models using images of pedestrians divided into two categories based on their scores on the Fitzpatrick scale, which is commonly used to classify human skin color. According to the researchers' paper, the models exhibited "uniformly poorer performance" when confronted with pedestrians with the three darkest shades on the scale. On average, the models' accuracy decreased by 5 percent on the group containing images of pedestrians with darker skin tones, even when the researchers accounted for variables such as whether the photo was taken during the day or at night. The researchers also identified how a future of biased self-driving cars could be avoided: include more images of dark-skinned pedestrians in the data sets the systems train on, and place more weight on accurately detecting those images during training.
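The second remedy the researchers mention, placing more weight on under-represented images, is commonly implemented as per-sample loss weighting during training. Below is a minimal, hypothetical sketch of that idea in plain Python; the group labels, weight values, and probabilities are illustrative assumptions, not data from the paper.

```python
import math

def weighted_log_loss(probs, labels, group_ids, group_weights):
    """Weighted mean negative log-likelihood: each sample's contribution
    is scaled by the weight assigned to the group it belongs to.
    (Illustrative sketch; not the paper's actual training code.)"""
    total, weight_sum = 0.0, 0.0
    for p, y, g in zip(probs, labels, group_ids):
        # Probability the model assigned to the correct label.
        p_true = p if y == 1 else 1.0 - p
        w = group_weights[g]
        total += -w * math.log(max(p_true, 1e-12))
        weight_sum += w
    return total / weight_sum

# Hypothetical example: group 0 = over-represented lighter skin tones,
# group 1 = under-represented darker skin tones. The model is less
# confident on group 1, so upweighting that group raises the loss it
# pays there, pushing training to improve on those examples.
probs  = [0.9, 0.8, 0.6, 0.55]   # model's predicted probabilities
labels = [1, 1, 1, 1]            # all are true pedestrians
groups = [0, 0, 1, 1]

uniform   = weighted_log_loss(probs, labels, groups, {0: 1.0, 1: 1.0})
upweighted = weighted_log_loss(probs, labels, groups, {0: 1.0, 1: 2.0})
```

With uniform weights the loss averages all samples equally; doubling the weight on group 1 increases the overall loss, because the harder, under-represented examples now count for more, which is the intended training pressure.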
Read more of this story at Slashdot.