The WIRED Politics Desk: How Artificial Intelligence Is Changing Politics (and How It Has Already Reached Millions of People)
In our project, we tracked uses of AI that, in some cases, have reached millions of people. For example, TikToks featuring an AI-generated image that made Prabowo Subianto, Indonesia’s former defense minister and now president-elect, seem cute and cuddly were viewed more than 19 billion times. (Subianto was at one point banned from the US over alleged human rights abuses.)
The electorate now has to contend with this new technology. Deepfakes can be used for everything from sabotage to satire to the seemingly mundane: we’ve already seen AI chatbots write speeches and answer questions about a candidate’s policy. We’ve seen AI used to humiliate female politicians and to make world leaders appear to promote the benefits of passive income. And AI can be used to tailor automated texts to voters.
Hi! I’m Vittoria Elliott, a reporter on the WIRED Politics desk, and I’ll be taking over for Makena to talk about politicians rising from the dead in India and the rapper endorsing opposition parties in South Africa.
OpenAI: Influence Networks From Russia, Iran, China, and Israel Are Running Up Against the Limits of Generative AI
Many Americans have their eyes on November, but 2024 has already been a big election year for the rest of the world. India, the world’s largest democracy, is wrapping up its vote; South Africa and Mexico are both heading to the polls this week; and the EU is ramping up for its parliamentary elections in June. It’s the largest election year in history, and there are more people online than ever before.
In her research, Jessica Walton, a researcher with the CyberPeace Institute who has studied the Russian network Doppelganger’s use of generative AI, found that the network would use real-seeming Facebook profiles to post articles, often around divisive political topics. “The actual articles are written by generative AI,” she says. “And mostly what they’re trying to do is see what will fly, what Meta’s algorithms will and won’t be able to catch.”
Today, OpenAI released its first threat report, detailing how actors from Russia, Iran, China, and Israel have attempted to use its technology for foreign influence operations across the globe. The report named five different networks that OpenAI identified and shut down between 2023 and 2024. According to the report, networks like Russia’s Doppelganger and China’s Spamouflage are experimenting with generative AI in their operations. They are not very good at it.
It may be a relief that these actors haven’t mastered generative AI well enough to become unstoppable forces for disinformation, but it is also clear that they are experimenting.
The OpenAI report reveals that influence campaigns are running up against the limits of generative AI, which doesn’t reliably produce good copy or code. It struggles with idioms, which make language sound more reliably human and personal, and sometimes with basic grammar (so much so that OpenAI named one network “Bad Grammar”). The Bad Grammar network was so sloppy that it once revealed its true identity, posting: “As an AI language model, I am here to assist and provide the desired comment.”
One network used ChatGPT-generated code to build an automation system for Telegram, a chat app that has long been used by extremists and influence networks. Sometimes this worked, but other times it led to the same account posting as two different characters, giving away the game.
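OpenAI’s report doesn’t include the operators’ code, but a minimal, purely hypothetical sketch suggests how that kind of slip can happen when persona and account state get out of sync. The personas, account IDs, and the post_to_telegram helper below are invented for illustration and are not drawn from the report.

```python
# Hypothetical sketch of a multi-persona posting loop. None of these names,
# accounts, or helpers come from OpenAI's report; they only illustrate the
# kind of state-handling bug that makes one account speak in two voices.

import itertools

PERSONAS = {
    "persona_a": {"tone": "outraged local resident", "account": "acct_001"},
    "persona_b": {"tone": "cheerful news aggregator", "account": "acct_002"},
}


def generate_comment(tone: str) -> str:
    # Stand-in for a call to a text-generation model.
    return f"[{tone}] comment on today's headlines"


def post_to_telegram(account: str, text: str) -> None:
    # Stand-in for whatever Telegram client the operators actually automated.
    print(f"{account}: {text}")


def run_campaign(posts: int) -> None:
    # The bug: the account is chosen once, but the persona keeps rotating,
    # so a single account ends up posting as two different characters.
    account = PERSONAS["persona_a"]["account"]
    personas = itertools.cycle(PERSONAS.values())
    for _ in range(posts):
        persona = next(personas)
        post_to_telegram(account, generate_comment(persona["tone"]))


run_campaign(4)
```

In a sketch like this, rotating the persona without rotating the account is enough to make one profile switch voices mid-thread, the same tell the report describes.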
But influence campaigns on social media often innovate over time to avoid detection, learning the platforms and their tools, sometimes better than the platforms’ own employees. While these initial campaigns may be small or ineffective, they appear to still be in the experimental stage, says Walton.
The report paints a picture of several relatively ineffective campaigns peddling crude propaganda, seemingly allaying fears that the new technology could be used to spread misinformation during a crucial election year.