Facial-Recognition Software Might Have a Racial Bias Problem
Depending on how algorithms are trained, they could be
significantly more accurate when identifying white faces than
African American ones.
Clare Garvie and Jonathan Frankle
Apr 7, 2016
In 16 “undisclosed locations” across northern Los Angeles, digital
eyes watch the public. These aren’t ordinary police-surveillance
cameras; these cameras are looking at your face. Using
facial-recognition software, the cameras can recognize individuals
from up to 600 feet away. The faces they collect are then compared,
in real time, against “hot lists” of people suspected of gang
activity or wanted on an open arrest warrant.
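The comparison step described above can be pictured as a nearest-neighbor search over face embeddings. The sketch below is purely illustrative and assumes a system that reduces each face to a numeric vector; the function name `match_against_hotlist` and the similarity threshold are hypothetical, not details of the Los Angeles deployment or any real vendor's software.

```python
import numpy as np

def match_against_hotlist(probe, hotlist, threshold=0.6):
    """Return indices of hotlist entries whose embedding resembles the probe.

    probe: 1-D face-embedding vector for the observed face.
    hotlist: 2-D array, one embedding row per watchlisted person.
    threshold: illustrative cosine-similarity cutoff; real systems tune
               this value, and where it sits drives the false-match rate.
    """
    probe = probe / np.linalg.norm(probe)
    hotlist = hotlist / np.linalg.norm(hotlist, axis=1, keepdims=True)
    similarities = hotlist @ probe  # cosine similarity per entry
    return np.flatnonzero(similarities >= threshold)

# Toy data: three watchlist entries; the probe is a noisy copy of entry 0.
rng = np.random.default_rng(0)
hotlist = rng.normal(size=(3, 128))
probe = hotlist[0] + 0.05 * rng.normal(size=128)
print(match_against_hotlist(probe, hotlist))  # → [0]
```

The threshold is where bias can surface: if the underlying embeddings are less discriminative for some demographic groups, the same cutoff yields more false matches for those groups.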
[…]
Continues here:
http://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/