AI-powered face swapping has taken a dystopian turn. Now it's being used to make revenge porn of people's friends and acquaintances. <https://www.engadget.com/2018/01/26/ai-face-swap-deepfakes/>

As if issues like revenge porn and AI-powered facial-recognition searches <https://www.engadget.com/2017/10/11/pornhub-computer-vision-ai-database/> weren't creepy enough, a /Motherboard/ <https://motherboard.vice.com/en_us/article/ev5eba/ai-fake-porn-of-friends-de...> report reveals yet another unsettling use of the technology: "deepfakes." Within a month of locating a Redditor who used machine learning to swap pictures of mainstream actresses onto the bodies of women performing in porn movies, the outlet has found people using an app based on his techniques to create videos from images of women they know. By scraping social media accounts for a full library of photos and using web apps that find porn performers whose faces resemble the person being targeted, it automates a process that some revenge porn sites had already been doing manually. We've seen similar technology used in movies for years <https://www.engadget.com/2016/12/27/how-rogue-one-resurrected-a-dead-actor-a...>, but with AI running on desktop GPUs or in the cloud, random people suddenly have access and are using it in unsettling ways (like the Nicolas Cage-on-Amy Adams scene shown here). [...]

And also: People Can Put Your Face on Porn—and the Law Can't Help You <https://www.wired.com/story/face-swap-porn-legal-limbo/>

The grosser parts of the internet have a new trick: using machine learning and AI to swap celebrities' faces onto porn performers'. The result? Fake celebrity porn seamless enough to be mistaken for the real thing. Early victims include Daisy Ridley, Gal Gadot, Scarlett Johansson, and Taylor Swift.
Originally reported by Motherboard <https://motherboard.vice.com/en_us/article/gydydm/gal-gadot-fake-ai-porn>, this nasty trend has been brewing for months, acquiring its own subreddit. And now that someone has made an app <https://motherboard.vice.com/en_us/article/bjye8a/reddit-fake-porn-app-daisy-ridley>—drastically lowering the technical threshold would-be creators have to clear—it's presumably about to become much more prevalent. For reasons that are eye-poppingly obvious, these videos—which their creators refer to as "deepfakes," after the redditor who created the process—are terrible. It's a noxious smoothie made of some of today's worst internet problems: a new frontier for nonconsensual pornography and fake news alike. (Doctored videos of political candidates saying outlandish things in 3, 2...) And worst of all? If you live in the United States and someone does this with your face, the law can't really help you. [...]

Questions: What will remain of video (and above all of audio recordings) as evidence in court, if it can be fabricated effortlessly and cheaply with an app? Perhaps we'll have an AI do the forensic analysis? And outside the courtroom? What will become of investigative and watchdog journalism? "That wasn't me in the video, it's a fake." Which of you lawyers will be the first in Italy to challenge the reliability of a video submitted as evidence?

Alberto