Researchers Say Autonomous Vehicles Could Be Racist

March 22, 2019
Study finds object detection systems are worse at detecting pedestrians with darker skin types.

A myriad of technological challenges have raised roadblocks to public acceptance of autonomous vehicles, but who could have guessed that one of them would turn out to be the perceived racism of pedestrian detection systems?

That was the danger raised by researchers Benjamin Wilson, Judy Hoffman and Jamie Morgenstern of the Georgia Institute of Technology, who say that people with darker skin are more likely to be hit by autonomous vehicles than people with lighter skin.

According to their findings, object detection systems of the kind used in autonomous vehicles had uniformly poorer performance when detecting pedestrians with darker skin types. Seeking alternative explanations for their results, they found that neither time of day nor occlusion (the pedestrian being partially blocked from view) explains the disparity.

The study used what is called the Fitzpatrick skin type scale, which was introduced to measure a number of physical attributes (such as hair and eye color, along with a person’s tendency to freckle, burn, or tan when exposed to UV light). The scale’s six categories were assigned to one of two groups: one for lighter and one for darker skin tones.

In the Georgia Tech study, pedestrian detection in road scenes was tested for the two groups and showed a lower rate of detection for the darker-skinned group. On average, the researchers found that these systems were 5% less accurate at detecting people with darker skin.
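The underlying comparison is simple to describe: group each annotated pedestrian by skin tone and compare how often the detector finds members of each group. The short Python sketch below is not the researchers’ code; it only illustrates that kind of per-group comparison, using a hypothetical list of records in which each pedestrian carries a Fitzpatrick-based group label and a flag indicating whether the detector found them.

```python
from collections import defaultdict

# Hypothetical evaluation records: one entry per annotated pedestrian,
# with a Fitzpatrick-based group label and whether the detector found them.
detections = [
    {"group": "lighter", "detected": True},
    {"group": "lighter", "detected": True},
    {"group": "darker", "detected": False},
    {"group": "darker", "detected": True},
]

def detection_rate_by_group(records):
    """Return the fraction of annotated pedestrians detected, per group."""
    found = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        found[r["group"]] += int(r["detected"])
    return {g: found[g] / total[g] for g in total}

print(detection_rate_by_group(detections))
# Toy output: {'lighter': 1.0, 'darker': 0.5}
# On a large, representative test set, a persistent gap between the two
# rates is the kind of disparity the study reports (roughly 5%).
```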

This is not the first time that image recognition systems have demonstrated higher accuracy for whites, point out Michael R. Greco and Susan M. Schaecher, attorneys with the law firm of Fisher Phillips.

For example, last year, Amazon’s facial recognition system was tested by the American Civil Liberties Union (ACLU) and incorrectly matched 28 of the 535 members of Congress to mugshots of arrestees. The ACLU reported that “Nearly 40% of the false matches were people of color, even though they make up only 20% of Congress.”

Other kinds of technologies have proven problematic as well, ranging from facial recognition systems that don’t recognize non-white skin to cameras that tell Asian people to stop blinking. In one case, automatic soap dispensers in restrooms wouldn’t dispense soap for people with dark skin. Nor are the recognition problems confined to visual systems: voice recognition systems have had more trouble recognizing women’s voices than men’s.

Employers Also at Risk

It was these sorts of cases that prompted the researchers to conduct their study; they note the "many recent examples of machine learning and vision systems displaying higher error rates for certain demographic groups than others." They also point out that a "few autonomous vehicle systems already on the road have shown an inability to entirely mitigate risks of pedestrian fatalities," and that recognizing pedestrians is key to avoiding such deaths.

"We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models," the researchers concluded.

The Fisher Phillips lawyers note, “Some have suggested that if the engineers creating the datasets are not diverse, they tend to select images most like themselves and fail to recognize disparate representation in datasets when it occurs. This unconscious bias might be corrected through increasing the number of women and minorities in STEM jobs and, in the nearer term, testing explicitly for bias.”
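As a rough illustration of what “testing explicitly for bias” in a dataset could look like in practice, the Python sketch below checks whether any demographic group falls below a chosen share of a labeled training set. The field name, group labels, and 20% floor are illustrative assumptions, not details drawn from the study or from Fisher Phillips.

```python
from collections import Counter

def representation_report(samples, field="group", floor=0.2):
    """Report each group's share of the dataset and flag any group
    falling below a chosen representation floor (here 20%)."""
    counts = Counter(s[field] for s in samples)
    total = sum(counts.values())
    report = {}
    for group, n in counts.items():
        share = n / total
        report[group] = {"share": round(share, 3), "under_floor": share < floor}
    return report

# Hypothetical labeled training samples
training_set = [
    {"group": "lighter"}, {"group": "lighter"}, {"group": "lighter"},
    {"group": "lighter"}, {"group": "lighter"}, {"group": "darker"},
]
print(representation_report(training_set))
# {'lighter': {'share': 0.833, 'under_floor': False},
#  'darker': {'share': 0.167, 'under_floor': True}}
```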

They add that employers are increasingly facing claims of unconscious bias. “Where the workforce is not diverse, there is a risk that decisions may be made on, or influenced by, unconscious bias. Plaintiffs’ lawyers frequently argue that seemingly neutral rules or practices can have a disparate impact on a protected group. You must not only strive to diversify your workforce but be on guard against implicit bias.”

The attorneys’ advice:

  • Raise managers’ awareness of implicit bias through surveys and tests, which human resource consulting companies often provide.
  • Offer diversity training for managers and employees, as it can further awareness of implicit bias and how it can affect workplace decisions.
  • Establish objective criteria and unambiguous, consistent procedures for use when making employment-related decisions.
  • After giving managers these tools, assess their performance and hold them accountable for hiring, performance evaluation, and other employment decisions.

“Failing to follow these steps in the employment setting may not endanger life and limb, but it can certainly increase financial and personal risks,” they stress.

About the Author

David Sparkman | founding editor

David Sparkman is founding editor of ACWI Advance (www.acwi.org), the newsletter of the American Chain of Warehouses Inc. He also heads David Sparkman Consulting, a Washington D.C. area public relations and communications firm. Prior to these positions he was director of industry relations for the International Warehouse Logistics Association. Sparkman has also been a freelance writer, specializing in logistics and freight transportation. He has served as vice president of communications for the American Moving and Storage Association and director of communications for the National Private Truck Council, and spent two decades with American Trucking Associations on its weekly newspaper, Transport Topics.
