The new project, as described to Reuters, is purportedly trying to improve facial recognition software and determine whether many of the programs available on the market today are “biased” against persons of color. If you find yourself thinking that a software program is incapable of secretly harboring a racist agenda, I would definitely agree with you. But that doesn’t mean the algorithms can’t be seriously flawed and produce spectacularly incorrect results when broken down along racial lines. Assuming they manage to pull this off, however, how long do you think it will be before this new technology is unleashed in the field of “racial justice” applications?

“Google told Reuters this week it is developing an alternative to the industry standard method for classifying skin tones, which a growing chorus of technology researchers and dermatologists says is inadequate for assessing whether products are biased against people of color.

At issue is a six-color scale known as Fitzpatrick Skin Type (FST), which dermatologists have used since the 1970s. Tech companies now rely on it to categorize people and measure whether products such as facial recognition systems or smartwatch heart-rate sensors perform equally well across skin tones.

Critics say FST, which includes four categories for “white” skin and one apiece for “black” and “brown,” disregards diversity among people of color.”
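The critics’ complaint is easy to see just by tabulating the scale. A minimal sketch (the mapping of FST types to the “white”/“brown”/“black” groupings is taken from the Reuters description above, not from any official specification):

```python
from collections import Counter

# The six Fitzpatrick Skin Type (FST) categories, grouped per the
# Reuters article: four "white" types, one "brown", one "black".
FST_GROUPS = {
    1: "white",   # Type I:  always burns, never tans
    2: "white",   # Type II: usually burns, tans minimally
    3: "white",   # Type III: sometimes burns, tans gradually
    4: "white",   # Type IV: burns minimally, tans easily
    5: "brown",   # Type V:  rarely burns, tans darkly
    6: "black",   # Type VI: never burns
}

counts = Counter(FST_GROUPS.values())
print(dict(counts))
# {'white': 4, 'brown': 1, 'black': 1}
```

Two-thirds of the scale’s resolution is spent on lighter skin, which is exactly the “disregards diversity among people of color” objection.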

In the past, we’ve examined some of the glaring flaws in facial recognition software during its early years of development. Amazon’s Rekognition software has performed hilariously badly when measured along racial lines. In public testing, the first release identified white males correctly 100% of the time, and its success rate for Hispanic males was still over 90%. It fared considerably worse with white females, misidentifying them as males 7% of the time. When asked to identify Black females, its success rate fell well below half, and in almost one-third of cases it identified them as men.

This led to some results that were both amusing and disturbing. When the ACLU tested the software by scanning images of all of California’s legislators against a database of tens of thousands of mugshots, it falsely identified more than two dozen of the (mostly non-white) elected officials as criminals. Of course, this is California we’re talking about, so maybe it wasn’t that far off the mark.