SKIN-SIGHTS

Google is tracking skin tones of people to make its search results ‘more inclusive’ after tech racism fears

GOOGLE has plans to introduce a new scale for measuring skin tones in an attempt to eliminate AI bias.

After partnering with Harvard professor Ellis Monk, Google is promoting a new way of identifying skin tones within its products and services.


Most notably, this research will shape what users see in Search results and in Google's Photos app.

In Search, Google is implementing the Monk Skin Tone (MST) scale to show results that are more inclusive of darker skin tones.


For example, bridal makeup or hair-related searches will come with an algorithm that takes different skin tones into account so users can receive the most relevant results.

And, in Photos, the MST scale will provide a new set of “Real Tone filters” that are “designed to work well across skin tones” and offer “a wider assortment of looks,” per Google.

In time, Google hopes to employ the scale throughout more of its products and services, calling it an "important next step in a collective effort to improve skin tone inclusivity in technology."

"For Google, it will help us make progress in our commitment to image equity and improving representation across our products. And in releasing the MST Scale for all to use, we hope to make it easier for others to do the same, so we can learn and evolve together."


The move by the tech giant follows a recent lawsuit in March that accused Google of bias against black employees.

This lawsuit is one of many that call out the tech industry's biases – not only in employment practices but in product development as well.

"Machines can discriminate in harmful ways. I experienced this firsthand when I was a graduate student at MIT in 2015 and discovered that some facial analysis software couldn’t detect my dark-skinned face until I put on a white mask," Joy Buolamwini wrote for TIME in 2019.

"These systems are often trained on images of predominantly light-skinned men," added Buolamwini, whose research revealed significant gender and racial bias in AI systems sold by tech giants like IBM, Microsoft, and Amazon.


"We often assume machines are neutral, but they aren’t."

Another example is SkinVision, an AI-powered app that aims to detect skin cancer.

SkinVision's algorithm was criticized in 2021 for being more effective on lighter skin tones.
