Self-driving cars risk ‘future errors’ due to difficulty detecting darker skin tones: Researchers
Systems used by self-driving cars to avoid pedestrians may be less accurate at detecting people with darker skin, a report out of the Georgia Institute of Technology warned recently.
Researchers published their findings in a report, “Predictive Inequity in Object Detection,” after testing several similar systems to determine how accurately they spot people of varying skin tones.
Using a collection of digital photographs showing street scenes with pedestrians, the researchers grouped each person pictured in the dataset by their placement on the Fitzpatrick scale, a classification schema that rates skin tone from one (lightest) to six (darkest), and then gauged how well the object-detection systems spotted people on either side of the spectrum.
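The core comparison the researchers describe can be sketched as follows. This is a minimal illustration, not the study's actual code: the data, field names, and helper function are hypothetical, and it simplifies detection accuracy to a single per-group recall rate.

```python
# Hypothetical sketch of the study's comparison: detection accuracy (recall)
# computed separately for lighter-skinned (Fitzpatrick 1-3) and darker-skinned
# (Fitzpatrick 4-6) pedestrians. All data below is illustrative.

# Each record: (fitzpatrick_type, detected) for one annotated pedestrian.
annotations = [
    (1, True), (2, True), (3, False), (2, True),   # lighter side of the scale
    (4, True), (5, False), (6, False), (5, True),  # darker side of the scale
]

def group_recall(records, skin_types):
    """Fraction of pedestrians with the given skin types that were detected."""
    subset = [detected for t, detected in records if t in skin_types]
    return sum(subset) / len(subset) if subset else float("nan")

light_recall = group_recall(annotations, {1, 2, 3})
dark_recall = group_recall(annotations, {4, 5, 6})

# The "predictive inequity" is the gap between the two detection rates.
gap = light_recall - dark_recall
print(f"lighter: {light_recall:.2f}, darker: {dark_recall:.2f}, gap: {gap:.2f}")
```

With these toy numbers the gap is 0.25; the study reported an average gap of about five percentage points on real data.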
Summarizing the results, the researchers wrote that their findings “show uniformly poorer performance of these systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6.”
“This behavior suggests that future errors made by autonomous vehicles may not be evenly distributed across different demographic groups,” the researchers wrote.
On average, detection was five percentage points less accurate for individuals on the darker side of the scale, Vox reported Wednesday.
“The main takeaway from our work is that vision systems that share common structures to the ones we tested should be looked at more closely,” Jamie Morgenstern, one of the authors of the study, told the website.
Vox cautioned that the report “should be taken with a grain of salt,” however: it has not been peer-reviewed, and the researchers did not test any object-detection systems actually used by self-driving cars currently on the road, the website noted.
An Arizona woman was killed after being struck by a self-driving Uber in March 2018, marking what was reported at the time as the first fatal crash in the United States involving an autonomous vehicle and pedestrian. The ride-share company subsequently suspended its self-driving car tests in several markets as a result.
“The few autonomous vehicle systems already on the road have shown an inability to entirely mitigate risks of pedestrian fatalities,” the Georgia Tech researchers wrote. “A natural question to ask is which pedestrians these systems detect with lower fidelity, and why they display this behavior.”
Rep. Alexandria Ocasio-Cortez, New York Democrat, faced criticism earlier this year for suggesting computer algorithms could have biases, the IFL Science website recalled this week.
“Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions,” the freshman congresswoman warned in January. “They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.”