There is a blind spot in AI research
Kate Crawford & Ryan Calo

Fears about the future impacts of artificial intelligence are distracting researchers from the real risks of deployed systems.
<http://www.nature.com/news/there-is-a-blind-spot-in-ai-research-1.20805>

Autonomous systems are already deployed in our most crucial social institutions, from hospitals to courtrooms. Yet there are no agreed methods to assess the sustained effects of such applications on human populations. Recent years have brought extraordinary advances in the technical domains of AI <http://www.nature.com/news/computer-science-the-learning-machines-1.14481>. Alongside such efforts, designers and researchers from a range of disciplines need to conduct what we call social-systems analyses of AI. They need to assess the impact of technologies on their social, cultural and political settings.

A social-systems approach could investigate, for instance, how the app AiCure — which tracks patients’ adherence to taking prescribed medication and transmits records to physicians — is changing the doctor–patient relationship. Such an approach could also explore whether the use of historical data to predict where crimes will happen is driving overpolicing of marginalized communities. Or it could investigate why high-rolling investors are given the right to understand the financial decisions made on their behalf by humans and algorithms, whereas low-income loan seekers are often left to wonder why their requests have been rejected.

Alberto
Participants (1): Alberto Cammozzo