

At some point in the months leading up to the 2024 election, a tape will leak that will confirm voters’ worst fears about President Joe Biden.

The audio, a bit grainy and muffled as if it was recorded from a phone in someone’s pocket, will have the 80-year-old sounding confused, perhaps seeming to forget that he’s president, before turning murderously angry.

It may arrive in journalists’ inboxes from an anonymous whistleblower, or simply go viral on social media. Or maybe the uproar will be over audio of former President Donald Trump saying something that his supporters find disqualifying.

Whether such a clip is real or the work of new, startlingly realistic generative AI models, the affected politician will call it a fake and evidence of the other side’s willingness to lie, cheat and steal their way to the White House. And while generative AI experts say they will most likely be able to detect such fakes, it would be all but impossible to prove a recording is real.

Deepfake audio, authentic-sounding but false recordings built from short snippets of a subject talking, has become so realistic that it can fool even your own mother, presenting painfully obvious potential for underhanded political tactics. And it is another question, and a doubtful one at that, whether evidence of an audio clip’s provenance will matter to partisan voters so ready to reject any data point that doesn’t conform to their worldviews.

AI developers warn that the technology’s rapid development and widespread deployment risk ushering in an epistemological dystopia that would undermine the foundations of representative democracy.

“Campaigns are high stakes,” said Hany Farid, a generative AI expert at the University of California, Berkeley. “We know that we have state-sponsored actors interfering, and we know the campaigns are going to play dirty tricks.”
