Alarms are blaring about artificial intelligence (AI) deepfakes that manipulate voters, such as the robocall mimicking President Joe Biden's voice that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.
Yet there's a far bigger problem with deepfakes that we haven't paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98 per cent of deepfake videos online were pornographic, and that 99 per cent of those targeted were women or girls.