Emotion monitoring with AI in video communication services (Zoom etc.) and its risks
Good morning. It has been a while since I last wrote to the list; I am writing about this topic.

For communication we depend on solutions that, as is well known, are not subject to the regulations governing operators' telephony systems: they are treated as "applications" rather than "communication services", which would be far more tightly regulated.

This means that solutions like the ones described below are already in use, or in development, to monitor emotions, ability to concentrate, and so on during the many video calls we make for work, for study, or simply among friends...

I believe the topic deserves attention and reflection, which only occasionally emerges.

Warm regards,
gabriele

AI software from Uniphore, Sybill and Zoom detects customer emotions - Protocol <https://www.protocol.com/enterprise/emotion-ai-sales-virtual-zoom>

Companies are using AI to monitor your mood during sales calls. Zoom might be next. Software-makers claim that AI can help sellers not only communicate better, but detect the “emotional state” of a deal — and the people they’re selling to.

Virtual sales meetings have made it tougher than ever for salespeople to read the room. | Illustration: Christopher T. Fong/Protocol

Kate Kaye <https://www.protocol.com/u/katekaye> | April 13, 2022

Virtual sales meetings have made it tougher than ever for salespeople to read the room. So, some well-funded tech providers are stepping in with a bold sales pitch of their own: that AI can not only help sellers communicate better, but detect the “emotional state” of a deal — and the people they’re selling to. In fact, while AI researchers have attempted to instill human emotion into otherwise cold and calculating robotic machines for decades, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an attempt to help humans understand and respond to human emotion.
Virtual meeting powerhouse Zoom <https://www.protocol.com/zoom-videoconferencing-history-profit> also plans to provide similar features in the future.

“It’s very hard to build rapport in a relationship in that type of environment,” said Tim Harris, director of Product Marketing at Uniphore, regarding virtual meetings. The company sells software that attempts to detect whether a potential customer is interested in what a salesperson has to say during a video call, alerting the salesperson in real time during the meeting if someone seems more or less engaged in a particular topic. The system, called Q for Sales, might indicate that a potential customer’s sentiment or engagement level perked up when a salesperson mentioned a particular product feature, but then drooped when the price was mentioned. Sybill, a competitor, also uses AI in an attempt to analyze people’s moods <https://www.protocol.com/enterprise/driver-monitoring-ai-infrastructure-bill> during a call.

Uniphore’s software incorporates computer vision, speech recognition, natural-language processing and emotion AI to pick up on the behavioral cues associated with someone’s tone of voice, eye and facial movements or other non-verbal body language, then analyzes that data to assess their emotional attitude. And there’s an actual digital emotion scorecard. Sitting alongside someone’s image on camera during a virtual meeting, the Q for Sales application visualizes emotion through fluctuating gauges indicating detected levels of sentiment and engagement, based on the system’s combined interpretation of their satisfaction, happiness, engagement, surprise, anger, disgust, fear or sadness.

The software requires video calls to be recorded, and it can only assess someone’s sentiment when that individual customer — or room full of potential customers — and the salesperson have approved recording.
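To make the "fluctuating gauges" concrete: a system like the one described might collapse per-frame emotion-category scores into a single sentiment reading. The category names below come from the article; the valence weights and the weighted-sum scheme are purely illustrative assumptions, not Uniphore's actual method.

```python
# Hypothetical sketch: collapse per-frame emotion-category probabilities
# into one "sentiment gauge" value in roughly [-1, 1].
# Weights are illustrative assumptions, not a vendor's real parameters.
VALENCE = {
    "satisfaction": 1.0, "happiness": 1.0, "surprise": 0.2,
    "anger": -1.0, "disgust": -1.0, "fear": -0.8, "sadness": -0.8,
}

def sentiment_gauge(frame_scores):
    """Map per-category probabilities (summing to ~1) to a single score."""
    return sum(VALENCE.get(cat, 0.0) * p for cat, p in frame_scores.items())

# A frame the classifier reads as mostly happy:
happy_frame = {"happiness": 0.7, "surprise": 0.2, "sadness": 0.1}
print(round(sentiment_gauge(happy_frame), 2))  # 0.66
```

Plotting such a score frame by frame would produce exactly the kind of gauge that rises on a product-feature mention and droops when price comes up — which is also why critics note that the gauge only reflects the classifier's reading of the face, not the person's actual state.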
Image: Uniphore

Although Harris said Uniphore does not compile profiles of individual people based on the data it intercepts and generates, its software does provide data it says indicates the “emotional state of a deal,” based on the sentiment and engagement of all members of a buying committee who have been present in meetings across the timeline of discussions with that potential customer.

Always be … recording?

But the mere request to record a virtual conversation can alter a customer’s attitude, said Grace Briscoe, senior vice president of Client Development at digital ad company Basis Technologies. “As soon as that recording alert comes up, it puts people on guard,” she said. “I think it would be off-putting for the clients; they would be less candid. I don’t think it would be conducive to the kind of relationship building that we want to do.”

While some sales meeting participants might be uncomfortable being recorded, others will be more open to it, said Josh Dulberger, head of Product, Data and AI at Zoom. “Part of it is the culture of the sales team,” he said, noting that recording might not be tolerated when selling to more sensitive industries such as financial services.

Zoom, the king of virtual meetings, said Wednesday it is introducing new features, called Zoom IQ for Sales, that provide sales meeting hosts with post-meeting conversation transcriptions and sentiment analysis. Although some AI-based transcription services have been known to make mistakes, Dulberger said Zoom’s software was built in-house using its own automated speaker recognition and natural-language-understanding system. The system is integrated with Salesforce. “We’re looking at things like speaker cadence and other factors in the linguistic approach to try to disentangle one speaker from another,” Dulberger said.

For now, the new Zoom features for salespeople do not assess sentiment in real time during a meeting. Instead, they deliver post-meeting analysis.
For instance, Dulberger said an interaction might be labeled as “low engagement” if the potential customer did not speak much. “You will be able to measure that they weren’t very well engaged,” he said, noting that salespeople aim for balanced conversations during which customers talk as often as a sales rep.

Frustration detected. Show empathy.

Sentiment analysis is nothing new. Since the early days of social media, software providers have sucked up text from posts, tweets and product reviews, analyzing their content to help determine what they mean for consumer brands, restaurants or political candidates. Today, software for help desk chats and call centers employs voice recognition and natural-language-processing AI to prompt customer service reps to speak more slowly or be more energetic. For example, Amazon has partnered with Salesforce to bring sentiment analysis to apps used by customer service agents, and a product from Cogito uses in-call voice analysis to assess the emotional state of callers or service reps. “Frustration detected. Show empathy,” states an alert shown as an example on Cogito’s website.

Questionable AI for coaching basic human skills

But what companies such as Uniphore, which recently collected $400 million in Series E funding at a valuation of $2.5 billion, and Sybill are doing goes further than customer service prompts. Uniphore and Sybill aim to monitor human behavior during video calls in real time. And they are betting that even seasoned salespeople can benefit from the guidance of their emotional AI coaching. Dulberger said Zoom also has active research underway to incorporate emotion AI into the company’s products in the future.
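The post-meeting "low engagement" label Dulberger describes above is, at its simplest, a talk-time ratio computed from the diarized transcript. A minimal sketch, assuming a 25% threshold chosen purely for illustration (Zoom's actual cutoff and metric are not public):

```python
# Hedged sketch of a talk-time engagement heuristic: compare how long the
# customer spoke versus the sales rep. The 25% threshold is an assumption.
def engagement_label(customer_seconds, rep_seconds, threshold=0.25):
    """Label a call 'low engagement' when the customer's share of total
    talk time falls below the threshold."""
    total = customer_seconds + rep_seconds
    if total == 0:
        return "no speech detected"
    share = customer_seconds / total
    return "low engagement" if share < threshold else "engaged"

# Customer spoke 2 minutes out of a 32-minute conversation:
print(engagement_label(120, 1800))  # low engagement
```

Note that the metric is only as good as the speaker diarization feeding it; that is why Dulberger mentions cadence and linguistic cues for disentangling speakers.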
He pointed to research he said shows that improvements are being made to AI used to detect people’s emotions, including a study <https://link.springer.com/content/pdf/10.1007/s42452-020-2234-1.pdf> involving a technique that removes facial images from background imagery that can confuse computers, and a new data set <https://www.nature.com/articles/s41597-022-01262-0> that incorporated facial expression data, physiological signals such as heart rate and body temperature, and self-reported emotions.

“These are informational signals that can be useful; they’re not necessarily decisive,” Dulberger said, noting that metrics based on emotion AI could be added to provide salespeople with a richer understanding of what happened during a sales meeting, for instance by detecting, “We think sentiments went south in this part of the call.”

Briscoe said she recognized the potential value of emotion-AI-based technologies as management tools to help determine which salespeople might be experiencing problems. However, she said, “Companies should hire people who have some level of emotional intelligence. If the people on our team cannot read that someone has lost interest, those are basic human skills that I don’t know why you’d need AI [to facilitate].”

Image: Uniphore

Even if emotional AI guidance is appealing to some sales teams, its validity is in question <https://www.nature.com/articles/d41586-020-00507-5>. “The claim that a person’s interior state can be accurately assessed by analyzing that person’s face is premised on shaky evidence,” wrote Kate Crawford in a 2021 article <https://www.theatlantic.com/technology/archive/2021/04/artificial-intelligen...> in The Atlantic.
In the article, Crawford, an AI ethics scholar, research professor at USC Annenberg and a senior principal researcher at Microsoft Research, cited a 2019 research paper <https://journals.sagepub.com/doi/10.1177/1529100619832930> that stated, “The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation.”

“We’re able to look at faces and classify them into different emotional expressions established by psychologists that are pretty standard out there,” said Patrick Ehlen, Uniphore’s vice president of AI.

Ehlen said the technology Uniphore has developed uses the same signals people use to infer what others are thinking or feeling, such as facial expressions, body language and tone of voice. “We endeavor to do as well as a human,” he said. Uniphore’s software incorporates computer vision and human emotion analysis technology the company acquired when it purchased Emotion Research Labs in 2021 for an undisclosed price.

Uniphore’s AI model was trained using open-source and private data sets featuring images of diverse ethnic groups of people, Ehlen said. Some of that data came from actual sales meetings the company held. To help the machine learn what facial cues represent certain types of emotions, the image data was labeled by people Uniphore hired to make those annotations, based on a set of guidelines the company established and then modified based on whether people agreed on certain criteria, he said. “Going forward there’s always room for these things to improve as the system gets in the hands of larger domains,” Ehlen said.
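The annotation step Ehlen describes, where hired labelers tag images and guidelines are revised based on whether labelers agree, typically boils down to majority voting plus an agreement check. A minimal sketch, assuming a 2/3 agreement threshold chosen only for illustration (Uniphore's actual criteria are not public):

```python
# Hypothetical sketch of consolidating several annotators' emotion labels
# for one image: keep the majority label, and report the agreement rate
# (low agreement would flag guideline criteria needing revision).
from collections import Counter

def consolidate(labels, min_agreement=2/3):
    """Return (majority_label, agreement); majority_label is None when
    annotators disagree too much to trust any label."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    agreement = votes / len(labels)
    return (label if agreement >= min_agreement else None), agreement

print(consolidate(["anger", "anger", "disgust"])[0])    # anger
print(consolidate(["fear", "surprise", "sadness"])[0])  # None
```

The low-agreement case is exactly the scenario the 2019 paper quoted above predicts: a scowl that one annotator calls anger, another confusion, another concentration.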
The company is also conducting a validation study of the software. But Ehlen recognized the limitations of the technology. “There is no real objective way to measure people’s emotions,” he said. “You could be smiling and nodding, and in fact, you’re thinking about your vacation next week.”

Kate Kaye <https://www.protocol.com/u/katekaye> is an award-winning multimedia reporter digging deep and telling print, digital and audio stories. She covers AI and data for Protocol. Her reporting on AI and tech ethics issues has been published in OneZero, Fast Company, MIT Technology Review, CityLab, Ad Age and Digiday, and heard on NPR. Kate is the creator of RedTailMedia.org and is the author of “Campaign ’08: A Turning Point for Digital Media,” a book about how the 2008 presidential campaigns used digital media and data.

Privacy groups urge Zoom to abandon emotion AI research (techtarget.com) <https://www.techtarget.com/searchunifiedcommunications/news/252518128/Privac...>

Privacy organizations want Zoom to ditch emotion-tracking AI. The tech is invasive, discriminatory and doesn’t work, the groups said.

By Mike Gleason <https://www.techtarget.com/contributor/Mike-Gleason>, News Writer | Published: 12 May 2022

Multiple human rights organizations have asked Zoom to keep emotion-tracking AI out of its products, calling the technology discriminatory, reliant on pseudoscience and potentially dangerous. Almost 30 advocacy groups cosigned a letter <https://www.fightforthefuture.org/news/2022-05-11-letter-to-zoom/> to Zoom CEO Eric Yuan this week. They urged him to abandon research into emotion AI <https://www.techtarget.com/searchcio/feature/Emotion-AI-shows-promise-for-IT...>, which uses facial expressions and vocal cues to determine a user’s state of mind.
The letter’s signatories, including the American Civil Liberties Union and the Electronic Privacy Information Center, said emotion AI’s invasive nature, inaccuracy and potential for misuse would harm Zoom users. Zoom did not respond to a request for comment.

Advocacy group Fight for the Future started the campaign in reaction to a Protocol story <https://www.protocol.com/enterprise/emotion-ai-sales-virtual-zoom> about Zoom’s plans to incorporate emotion AI in products. The organization said Zoom has a responsibility as an industry leader to protect the public from such problematic tech. “[Zoom] can make it clear that this technology has no place in video communications,” the letter read.

Emotion AI’s invasive monitoring <https://www.techtarget.com/searchunifiedcommunications/feature/Monitoring-pr...> would violate worker privacy and human rights, the groups said. A company, for example, could use the technology to review meetings and punish employees for expressing the wrong emotions.

The efficacy and ethics <https://www.computerweekly.com/opinion/Can-we-rely-on-AI?> of emotion AI are controversial. Even if AI software correctly reads a person’s expression, it may not accurately judge their feelings. In a 2019 study <https://www.psychologicalscience.org/publications/emotional-expressions-reco...>, Northeastern University professor Lisa Feldman Barrett and her colleagues found that facial expressions have limited reliability. For example, a scowl could indicate anger, confusion or concentration. Culture and individual circumstances may also affect how people express their feelings.

Facial-recognition AI tools have struggled with race as well <https://www.techtarget.com/searchenterpriseai/feature/Combating-racial-bias-...>. A 2018 study <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3281765> by University of Maryland professor Lauren Rhue discovered that facial-recognition software attributed negative emotions to Black people more often than white people.
Commercial facial-recognition software also misidentified dark-skinned women almost 35% of the time <https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intel...>, compared to an error rate of 0.8% for light-skinned men, according to an MIT and Stanford University paper. In their letter to Zoom, privacy advocates said features based on emotion AI would embed error-prone tools in the productivity software used by millions of people.

Zoom has seized on AI to add value to video meetings. Last month, the company launched an AI sales tool <https://www.techtarget.com/searchunifiedcommunications/news/252515931/Zoom-l...> that analyzes video call transcripts, using data like the number of questions a customer asked to determine if they were engaged. The IQ for Sales feature is the first in a series of planned AI add-ons for Zoom.
Thank you for sharing. This is one of the most urgent issues of our time. You may recall Amazon's patent under which Alexa can analyze your voice to work out whether you are well. I would very much like to set up, or take part in, a collective and interdisciplinary project on the ethical, socio-legal and computing aspects of the commodification of emotions. Cheers, Guido Noto La Diega
_______________________________________________
nexa mailing list
nexa@server-nexa.polito.it
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa
participants (2)
- gabriele elia
- Guido Noto La Diega