{"id":31805,"date":"2021-06-20T13:28:06","date_gmt":"2021-06-20T20:28:06","guid":{"rendered":"https:\/\/cww7news.com\/?p=31805"},"modified":"2021-06-20T13:28:09","modified_gmt":"2021-06-20T20:28:09","slug":"google-pursues-tech-to-measure-how-white-you-are","status":"publish","type":"post","link":"https:\/\/cww7news.com\/google-pursues-tech-to-measure-how-white-you-are\/","title":{"rendered":"Google pursues tech to measure how white you are"},"content":{"rendered":"\n

The new project, as described to Reuters, is purportedly trying to improve facial recognition software and determine whether many of the programs on the market today are “biased” against persons of color. If you find yourself thinking that a software program is incapable of secretly harboring a racist agenda, I would definitely agree with you. But that doesn’t mean the algorithms can’t be seriously flawed and produce spectacularly incorrect results when broken down along racial lines. Assuming they manage to pull this off, however, how long do you think it will be before this new technology is unleashed in the field of “racial justice” applications?

“Google told Reuters this week it is developing an alternative to the industry standard method for classifying skin tones, which a growing chorus of technology researchers and dermatologists says is inadequate for assessing whether products are biased against people of color.

At issue is a six-color scale known as Fitzpatrick Skin Type (FST), which dermatologists have used since the 1970s. Tech companies now rely on it to categorize people and measure whether products such as facial recognition systems or smartwatch heart-rate sensors perform equally well across skin tones.

Critics say FST, which includes four categories for “white” skin and one apiece for “black” and “brown,” disregards diversity among people of color.”
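For the technically inclined, here is a rough idea of what “measuring performance across skin tones” with FST looks like in practice. This is a minimal sketch with made-up labels, not Google’s or anyone else’s actual pipeline; the function name and the sample data are purely illustrative.

```python
# Minimal sketch (hypothetical data): bucketing a model's results by
# Fitzpatrick Skin Type to see whether it performs equally well across
# skin tones. Per the scale, types I-IV cover lighter ("white") skin,
# type V "brown," and type VI "black."
from collections import defaultdict

FST_TYPES = ["I", "II", "III", "IV", "V", "VI"]

def accuracy_by_skin_type(results):
    """results: iterable of (fst_type, prediction_was_correct) pairs."""
    correct, total = defaultdict(int), defaultdict(int)
    for fst_type, ok in results:
        total[fst_type] += 1
        correct[fst_type] += int(ok)
    return {t: correct[t] / total[t] for t in FST_TYPES if total[t]}

# Illustrative only: a large gap between types I-IV and V-VI is exactly
# the kind of disparity this scale is meant to surface, and which critics
# say its four-white, one-brown, one-black layout is too coarse to capture.
sample = [("II", True), ("II", True), ("VI", True), ("VI", False)]
print(accuracy_by_skin_type(sample))  # {'II': 1.0, 'VI': 0.5}
```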

In the past, we’ve examined some of the glaring flaws in facial recognition software during the early years of its development. Amazon’s Rekognition software has been hilariously bad at its job when measured against racial benchmarks. The first release, during public testing, identified white males correctly 100% of the time, and the success rate for Hispanic males was still over 90%. But it had more trouble with white females, misidentifying them as males 7% of the time. When asked to identify Black females, the success rate was well below half, and in almost one-third of cases it identified them as men.
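To put those percentages in context, benchmarks like this generally come from tallying a classifier’s mistakes separately for each demographic group. Here is a toy sketch of that kind of breakdown; the group labels and records are invented for illustration and have nothing to do with Rekognition’s actual test data.

```python
# Toy sketch (invented data): how often a gender classifier labels the
# women in a given demographic group as men.
def female_to_male_error_rate(records, group):
    """records: dicts with 'group', 'true_gender', 'predicted_gender'."""
    women = [r for r in records
             if r["group"] == group and r["true_gender"] == "female"]
    if not women:
        return None
    wrong = sum(1 for r in women if r["predicted_gender"] != "female")
    return wrong / len(women)

records = [
    {"group": "white_female", "true_gender": "female", "predicted_gender": "female"},
    {"group": "white_female", "true_gender": "female", "predicted_gender": "male"},
    {"group": "black_female", "true_gender": "female", "predicted_gender": "male"},
    {"group": "black_female", "true_gender": "female", "predicted_gender": "female"},
]
print(female_to_male_error_rate(records, "white_female"))  # 0.5 on this toy data
print(female_to_male_error_rate(records, "black_female"))  # 0.5 on this toy data
```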

This led to some results that were both amusing and disturbing. When the ACLU tested the software by scanning the images of all of California’s legislators and comparing them to a database of tens of thousands of mugshots, it identified more than two dozen of the (mostly non-white) elected officials as criminals. Of course, this is California we’re talking about, so maybe it wasn’t that far off the mark.
