A Google service that automatically labels images produced starkly different results depending on the skin tone of the person in a given image. The company fixed the issue, but the underlying problem is likely much broader.

In the fight against the novel coronavirus, many countries ordered that citizens have their temperature checked at train stations or airports. The device needed in such situations, the hand-held thermometer, has gone from a specialist item to a common sight. A branch of Artificial Intelligence known as “computer vision” focuses on automated image labeling, but most computer vision systems were trained on data sets that contained very few images of hand-held thermometers. As a result, they cannot label the device correctly.

In an experiment that went viral on Twitter, AlgorithmWatch showed that Google Vision Cloud, a computer vision service, labeled an image of a dark-skinned individual holding a thermometer “gun”, while a similar image of a light-skinned individual was labeled “electronic device”. A subsequent experiment showed that an image of a dark-skinned hand holding a thermometer was labeled “gun”, and that adding a salmon-colored overlay to the hand in the same image was enough for the computer to label it “monocular” instead.

Read more at https://algorithmwatch.org/en/story/google-vision-racism/

Giacomo
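
P.S. For anyone curious how such an experiment is run in practice: label detection is an ordinary API call. Below is a minimal sketch using Google's Cloud Vision Python client; the function name and image file name are illustrative, and the call pattern follows the library's documented usage rather than anything described in the article itself.

```python
# Minimal sketch of a Cloud Vision label-detection request (assumes the
# google-cloud-vision package is installed and GCP credentials are configured
# via GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import vision


def print_labels(path: str) -> None:
    """Send a local image to Google Cloud Vision and print the labels it returns."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Each annotation carries a label ("gun", "electronic device", ...) and a
    # confidence score between 0 and 1.
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


print_labels("hand_with_thermometer.jpg")  # illustrative file name
```

Running the same call on two images that differ only in skin tone, as AlgorithmWatch did, is enough to compare the labels the service assigns.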