Washington Passes Government Facial Recognition Rules
<https://www.govtech.com/policy/Washington-Passes-Government-Facial-Recogniti...>

(TNS) — Washington state senators Wednesday approved a bill that would begin regulating the use of facial-recognition programs by local and state governments.

Sponsored by Sen. Joe Nguyen, D-White Center, Senate Bill 6280 is one of a series of legislative proposals this year to counter technology that is evolving fast, regulated little and all but opaque to most residents.

Facial recognition has been a particular concern, with worries that law enforcement is using it before the programs can even identify people accurately. A landmark federal study released in December found that facial-recognition programs were misidentifying people of color more often than white people; women more often than men; and children and elderly people more often than people in other age ranges.

The bill now goes to the House for consideration.

Among other things, SB 6280 prohibits state and local government agencies from using facial recognition for ongoing surveillance in most instances. That surveillance would be allowed in support of law enforcement with a search warrant, or with an agency director’s determination under some conditions, such as an emergency that involves risk of death.

The legislation requires that any decision based on a facial-recognition program that has a legal impact be reviewed by an agency worker trained in facial recognition who has the authority to change the decision, according to a legislative analysis. Examples of such decisions include granting or denying financial loans, housing, health care or employment opportunities.

SB 6280 requires programs that have a legal impact to be tested by governments before being deployed. It sets training standards for government employees handling personal data gleaned from facial recognition. The bill also requires governments to issue annual reports disclosing how they use facial recognition and to hold community meetings on the reports.

But Nguyen’s proposal — like another bill championed by Senate lawmakers to regulate how companies use personal data and facial recognition — could face resistance in the House. There, some lawmakers have instead wanted to temporarily stop governments from using facial recognition.

Rep. Debra Entenman, D-Kent, sponsored one such proposal. House Bill 2856 would prohibit the use of facial-recognition programs by local and state governments until July 1, 2023. The bill passed a committee vote earlier this month but did not get a vote of the full House by a key deadline Wednesday.

Entenman described the debate over facial recognition as a question of fundamental privacy rights, and “about having a technology that is not ready to be used in the public sphere.” Additionally, “As an African American woman, I am of course concerned about the fact that law enforcement and others believe that this technology will make people safer,” she said.

Rep. Matt Boehnke, R-Kennewick, said he would like to see something between Nguyen’s proposal and Entenman’s bill. Boehnke — who worked on digital-data issues while in the U.S. Army — suggested a shorter moratorium of about a year so lawmakers could assess how facial recognition is being used by governments. “These are critical issues,” said Boehnke, assistant ranking Republican on the House Innovation, Technology & Economic Development Committee. “And we need to start seeing what’s going on.”

In a speech before the Senate passed his bill 30 to 18, Nguyen called it necessary to “implement strong moral guardrails” for facial-recognition programs. “Right now, facial-recognition technology is being used unchecked and with little recourse,” Nguyen said. “And tech companies generally don’t care about the moral values of the products they are creating.”
Interesting: it is imposing limits on private parties as well.

The problem with the person who can change the decision is that they will not do it, because their incentives are to confirm the system's prediction, even when they disagree.

I am more and more convinced that some form of prediction poisoning will be needed: https://5ta.it/cm/d5f6

ciao, s.

On 21/02/2020 10:27, Alberto Cammozzo wrote:
[quoted article trimmed]
-- reserve your meeting with me at https://cal.quintarelli.it
On 21/02/2020 10:41, Stefano Quintarelli wrote:
Interesting: it is imposing limits on private parties as well.
The problem with the person who can change the decision is that they will not do it, because their incentives are to confirm the system's prediction, even when they disagree.
I am more and more convinced that some form of prediction poisoning will be needed: https://5ta.it/cm/d5f6
The part about incentives is interesting, and I agree: information asymmetry generates adverse selection (Akerlof, George A. (1970). "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism". Quarterly Journal of Economics, 84).

On the poisoning I am skeptical: it may work, but who takes responsibility for the poisoning, that is, for deliberately making the machine err in order to keep the human attentive?

Imho, because of the automation paradox, "human-over-the-loop" cannot work. The human is either in or out. If in, they decide; if out, they serve no purpose. If they decide, the reasons for the decision must be internal to the human so that they can answer for it, with external decision-support devices, where applicable, whose output is the responsibility of whoever produces them.

ciao, A.
[quoted article trimmed]
On 21/02/2020 11:08, Alberto Cammozzo wrote:
The part about incentives is interesting, and I agree: information asymmetry generates adverse selection (Akerlof, George A. (1970). "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism". Quarterly Journal of Economics, 84).
On the poisoning I am skeptical: it may work, but who takes responsibility for the poisoning, that is, for deliberately making the machine err in order to keep the human attentive?
If you read my proposal carefully, there are no negative side effects that can emerge from the process I outline. It is a way to invoke a third opinion.
Imho, because of the automation paradox, "human-over-the-loop" cannot work. The human is either in or out. If in, they decide; if out, they serve no purpose.
If they are in, then in some use cases they decide the way the machine does, because their incentives push in that direction.
If they decide, the reasons for the decision must be internal to the human so that they can answer for it, with external decision-support devices, where applicable, whose output is the responsibility of whoever produces them.
That would be nice, but I fear it is idealistic... ciao, s.
On 21/02/2020 11:20, Stefano Quintarelli wrote:
If you read my proposal carefully, there are no negative side effects that can emerge from the process I outline.
It is a way to invoke a third opinion.
There has to be time to do that. The family sues the hospital because in the time lost taking the first decision (poisoned and discordant) and then the second, the patient, who in the meantime has died or been harmed, could have been saved (recall that the condition is acute). The doctor defends himself by protesting that he made the correct choice but the system prevented him from acting on it; the system's manufacturer defends itself by saying the machine behaved nominally; but the hospital still risks having to pay, because it obstructed what would otherwise have been the doctor's correct course of action. So next time it does not buy the machine.

It is the same problem as the automation that replaces airline pilots: they are trained to respond to the emergency caused by the failure of the automatic systems and to take control, yet they fail to do so in time because until a moment earlier they were off-the-loop.

And if there is plenty of time to ask for another opinion on a doubtful case, why not do it anyway?

Another proposal: the doctor may consult the automatic system or choose not to, and is not required to declare that choice either way, because he answers for his actions regardless.

Ciao, A
It is a way to invoke a third opinion.
There has to be time to do that.
The referral to the third opinion is realtime. In significant cases, a third opinion cannot be considered a delay...
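The proposal behind the short link is not reproduced in this thread, so the mechanics below are an assumption. As a minimal sketch, one way "prediction poisoning" plus a realtime third-opinion referral could fit together for a binary decision: occasionally show the reviewer a deliberately flipped prediction; if the reviewer endorses the flipped value, treat that as rubber-stamping, escalate the case to an independent third opinion, and restore the true prediction so the poisoned value never takes effect. All names here (`reviewed_decision`, `poison_rate`) are illustrative, not from the proposal.

```python
import random


def reviewed_decision(prediction, reviewer, poison_rate=0.1, rng=None):
    """Return (final_decision, needs_third_opinion).

    `prediction` is the system's binary output; `reviewer` is a callable
    that sees a prediction and returns the decision it endorses.  With
    probability `poison_rate`, the reviewer is shown the flipped
    prediction instead of the real one.  If the reviewer endorses the
    poisoned value, the case is flagged for a third opinion and the true
    prediction is restored, so the poisoning never reaches the outcome.
    """
    rng = rng or random.Random()
    poisoned = rng.random() < poison_rate
    shown = (not prediction) if poisoned else prediction
    endorsed = reviewer(shown)
    if poisoned and endorsed == shown:
        # Rubber-stamping detected: escalate, and act on the true prediction.
        return prediction, True
    return endorsed, False
```

A reviewer who simply echoes whatever is shown (`lambda p: p`) gets flagged whenever the poisoning fires, while a reviewer who judges independently is never misled, because the true prediction is restored before anything is acted on.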
And if there is plenty of time to ask for another opinion on a doubtful case, why not do it anyway?
Because the incentives push in the opposite direction.
Another proposal: the doctor may consult the automatic system or choose not to, and is not required to declare that choice either way, because he answers for his actions regardless.
The incentive is to always consult it and to confirm its prediction. That is what happens with laboratory technicians.
participants (2)
- Alberto Cammozzo
- Stefano Quintarelli