Frank Pasquale - Los Angeles Times op-ed
Good morning everyone,

to distract ourselves (so to speak) from the sad events in Paris these past few days, I'm forwarding an article by our colleague Frank Pasquale, who has just published his much-anticipated "The Black Box Society" with Harvard University Press.

---
WE'RE BEING STIGMATIZED BY 'BIG DATA' SCORES WE DON'T EVEN KNOW ABOUT
By Frank Pasquale
http://www.latimes.com/opinion/op-ed/la-oe-0116-pasquale-reputation-repair-d...

Excerpt: "It's only complaints, investigations and leaks that give us occasional peeks into these black boxes of data mining. But what has emerged is terrifying. Data brokers can use public records — of marriage, divorce, home purchases or voting — to draw inferences about any of us. And they can sell their conclusions to anyone who wants it.

Naturally, just as we've lost control of data, a plethora of new services are offering "credit repair" and "reputation optimization." But can they really help? Credit scoring algorithms are secret, so it's hard to know whether today's "fix" will be tomorrow's total fail. And no private company can save us from the thousands of other firms intent on mashing up whatever data is at hand to score and pigeonhole us. New approaches are needed.

What might those look like? I take some inspiration from a Virginia law that bars auto insurers from requiring their customers to release event-recorder data from their cars, or from raising their rates if they refuse. That is forward-thinking regulation that is getting ahead of algorithmic monitoring, rather than belatedly reacting to it. Some states have banned employers from demanding their employees' Facebook passwords. They could go further, requiring employers to share with applicants and current workers the types of outside intelligence they use when making decisions about them.

In general, we need what technology law and policy researcher Meg Leta Jones calls "fair automation practices" to complement the "fair data practices" that President Obama is proposing. We can't hope to prevent the collection or creation of inappropriate or inaccurate databases. But we can ensure that the use of that data by employers, insurers and other decision makers is made clear to us when we are affected by it. Without such notification, we may be stigmatized by secret digital judgments."

---
Best regards,
--
Antonio A. Casilli
Associate Professor, Telecom ParisTech
Research Fellow, Edgar-Morin Center (EHESS)
On the book "Black Box Society", the (interesting) take of a software engineer formerly at Google:
http://www.slate.com/articles/technology/bitwise/2015/01/black_box_society_b...

2015-01-17 16:05 GMT+01:00 Antonio Casilli <antonio.casilli@telecom-paristech.fr>:
Good morning everyone,
to distract ourselves (so to speak) from the sad events in Paris these past few days, I'm forwarding an article by our colleague Frank Pasquale, who has just published his much-anticipated "The Black Box Society" with Harvard University Press.
---
WE'RE BEING STIGMATIZED BY 'BIG DATA' SCORES WE DON'T EVEN KNOW ABOUT
By Frank Pasquale
http://www.latimes.com/opinion/op-ed/la-oe-0116-pasquale-reputation-repair-d...
Excerpt: "It's only complaints, investigations and leaks that give us occasional peeks into these black boxes of data mining. But what has emerged is terrifying. Data brokers can use public records — of marriage, divorce, home purchases or voting — to draw inferences about any of us. And they can sell their conclusions to anyone who wants it.
Naturally, just as we've lost control of data, a plethora of new services are offering "credit repair" and "reputation optimization." But can they really help? Credit scoring algorithms are secret, so it's hard to know whether today's "fix" will be tomorrow's total fail. And no private company can save us from the thousands of other firms intent on mashing up whatever data is at hand to score and pigeonhole us. New approaches are needed.
What might those look like? I take some inspiration from a Virginia law that bars auto insurers from requiring their customers to release event-recorder data from their cars, or from raising their rates if they refuse. That is forward-thinking regulation that is getting ahead of algorithmic monitoring, rather than belatedly reacting to it. Some states have banned employers from demanding their employees' Facebook passwords. They could go further, requiring employers to share with applicants and current workers the types of outside intelligence they use when making decisions about them.
In general, we need what technology law and policy researcher Meg Leta Jones calls "fair automation practices" to complement the "fair data practices" that President Obama is proposing. We can't hope to prevent the collection or creation of inappropriate or inaccurate databases. But we can ensure that the use of that data by employers, insurers and other decision makers is made clear to us when we are affected by it. Without such notification, we may be stigmatized by secret digital judgments."
---
Best regards,
--
Antonio A. Casilli
Associate Professor, Telecom ParisTech
Research Fellow, Edgar-Morin Center (EHESS)
_______________________________________________
nexa mailing list
nexa@server-nexa.polito.it
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa
participants (2)
- Antonio Casilli
- Ugo Pagallo