CHICAGO — As the 2024 presidential race heats up, politicians are turning to artificial intelligence to enhance their campaign efforts, and Florida Gov. Ron DeSantis appears to be the first presidential hopeful to use the technology in a political ad.
The 2024 presidential candidate’s campaign published an attack ad on social media Monday that included fake AI images of former President Donald Trump and Dr. Anthony Fauci hugging. The ad was aimed at attacking the former president and alleged Trump’s support of Fauci, who took heat from the GOP in response to his handling of the COVID-19 pandemic.
With fears that fake AI images might manipulate voters and flood the internet with false information, Twitter promptly added a “context” bubble to the post to inform readers of the misinformation. Twitter’s move to add context to a post, rather than delete it, may have come in response to the social platform’s alleged censorship during the 2020 presidential election, Hunter Biden’s laptop controversy and the pandemic.
The three AI-generated images in DeSantis’ attack ad were placed among three real images of Trump and Fauci from 2020, according to AFP, apparently to make viewers believe the fake images were real.
“It was particularly sneaky to intermix the real and the fake images as if the presence of the real image would give more credibility to the other images,” Hany Farid, a digital forensics expert and professor at the University of California, Berkeley, told NPR.
However, DeSantis’ video faced backlash from members of his own party. Sen. J.D. Vance (R-Ohio) responded to the ad in a tweet, calling the use of fake AI images “completely unacceptable.”
“Smearing Donald Trump with fake AI images is completely unacceptable. I’m not sharing them, but we’re in a new era. Be even more skeptical of what you see on the internet,” Vance said.
But this wasn’t the first time AI had been used in political ads.
Earlier this year, the Republican National Committee put out an ad made with artificial intelligence slamming President Joe Biden.
Axios reported the ad was the first the RNC had produced that was 100% AI-generated, meaning it used software to create images meant to look and feel real to voters, even though they are not.
The video asks what would happen if Biden and Vice President Kamala Harris were re-elected, then shows AI-generated images of economic turmoil, crime and domestic and international crises.
Sam Cornale, executive director of the Democratic National Committee, criticized the ad in a Twitter post responding to Axios’ story.
“When your operative class has been decimated, and you’re following MAGA Republicans off a cliff, I suppose you have no choice but to ask AI to help,” he said.
Sophisticated generative AI tools can now create cloned human voices and hyper-realistic images, videos and audio in seconds, at a minimal cost. When strapped to powerful social media algorithms, this fake and digitally created content can spread far and fast and target highly specific audiences, potentially taking campaign dirty tricks to a new low.
The implications for the 2024 campaigns and elections are as large as they are troubling: Generative AI can not only rapidly produce targeted campaign emails, texts or videos, but it also could be used to mislead voters, impersonate candidates and undermine elections on a scale and at a speed not yet seen.
“We’re not prepared for this,” warned A.J. Nash, vice president of intelligence at the cybersecurity firm ZeroFox. “To me, the big leap forward is the audio and video capabilities that have emerged. When you can do that on a large scale, and distribute it on social platforms, well, it’s going to have a major impact.”
The Associated Press and Cassie Buchman contributed to this report.