Can Artificial Intelligence on steroids hijack an election?
In the developed world, the arrival of AI as a weapon of election influence is a certainty, as is the prospect of AI tools becoming increasingly and astonishingly sophisticated.

About 49% of the world’s population lives in countries holding elections this year. Not all these elections have been, or will be, free or fair — witness Russia’s unashamedly rigged presidential election, or Pakistan’s national election held while its most popular politician, Imran Khan, was incarcerated. Even so, there will be many other, more reasonable contests, from the US to South Africa, where ballot box markings and their subsequent tallies are expected to be a fair expression of citizen preference.

However, even in the best of circumstances and in the most democratic of countries, elections bring out the worst in terms of spin and propaganda. Watching the parade of contesting candidates trafficking in half-truths, exaggerations, p-hacking and outright falsehood is dispiriting at best, partially because we, the voters, are not much better. We listen to and vote for the candidate who reinforces our preconceived world-views and biases, and we generally turn a deaf, even contemptuous, ear to the other side of the argument. 

It would be easy for me to say that this pattern explains the many voters in thrall to Trump in the US or devoted to the eternally disappointing ANC in South Africa, but there are plenty who could make the same accusation from the other side. It is in the nature of elections that they are a tragic testament both to politicians’ need to manipulate us and to the ease with which we allow ourselves to be manipulated.

Nothing new here. 

Except for artificial intelligence (AI), which is about to supercharge a further debasement of electioneering, starting this year. 

At the recent Wall Street Journal CEO Council Summit, the CEO of the AI company Cohere, Aidan Gomez, commented, “Humans are kind of the average of the ideas they are presented with.” He was speaking about the recommendation algorithms of social media behemoths like TikTok, which command the attention of hundreds of millions of people for hours every day. These algorithms decide what users see and use the psychology of addiction to make sure they keep watching.

And now, this same weapon is going to be let loose on the world during the many upcoming elections, and it is going to be scaled up and pumped full of AI steroids. 

Why now?

Because right now there is a kind of perfect storm happening. 

At its base is the fact that social media is currently where most people in the world get their news, opinions and content. The grand old world of curated news and analysis — from the likes of The Economist, The New York Times, the Financial Times or The Washington Post — has given way to the shouting of the hoi polloi, the noisy and chaotic barking of millions of uninformed or cynical voices freely broadcasting on a global platform without paywalls or editorial oversight. (Paywalls solicit money from readers to pay the best journalists and editors, at the cost of a far more restricted audience. The New York Times’ reputation as the paper of record may still be intact, but it is now an empty honorific of interest only to an elite audience.)

The second component of the perfect storm is the excellence and efficacy of the algorithms I mentioned above. If social media platforms can get people to stare glassy-eyed at lipstick or bicycle accident videos, they can easily do the same when it comes to political propaganda. 

Deformed facts and fine fiction

The last and most important element is our recently acquired ability to create and amplify misinformation and disinformation at hyperscale. Using AI, bad actors can churn out an almost uncountable number of tweets, threads, news feeds, videos and audio clips of dubious provenance. Most people do not check anything; they simply consume them, believe them and act on them. This, of course, happened before AI — it is well documented that the Russians built websites to interfere with the US elections of 2016 and 2020 — but most of those efforts were clumsy, amateurish and forgettable.

In contrast, AI now has the ability to produce a tsunami of diverse, personalised, well-written and compelling truth-stunted propaganda, replete with deformed facts, fine fiction, deepfake audio and AI-generated video clips. This stuff is about to wash over us all, and we will have no tools to separate the wheat from the chaff. 

It has long been a concern within the AI community that the grand promises lying on the AI horizon (in healthcare, biology, physics, agriculture, education, etc) may soon be overshadowed by a small group of bad actors weaponising AI with malicious intent. Many discussions have centred on the dangers of autonomous warfare, biological weapons, AI-assisted hacking or copyright theft.

It seems to me that AI-generated political propaganda will be far more pernicious.

Most importantly, it is not generally illegal to lie, at least not in the public square. A single critical word undetectably changed in a candidate’s speech and rebroadcast widely might easily sidestep a legal challenge. (“It was satire, your honour.”) Outright fibs in political propaganda can also avoid sanction under the protection of free-speech laws. It is difficult to find the warm body behind AI-generated content; it is extremely easy for the human puppet masters to hide in the shadows.

It has already started, as reported by CNN. In January’s New Hampshire primary, a deepfaked, Biden-voiced robocall urged voters to “save their vote for the November presidential election” (it is not clear how many people fell for this, but even so). In Slovakia, an audio clip was released in which the leading candidate appeared to boast about how he had rigged the election and, in a later clip, how he was going to raise the price of beer.

The only good news at this point, at least for South Africa, is that most political parties (and especially the ruling party) do not have the competence to do this (if you’ve ever tried to interact with a government website, I’m sure you’ll agree). However, in the developed world, the arrival of AI as a weapon of election influence is a certainty, as is the prospect of AI tools becoming increasingly and astonishingly sophisticated. And they are easy to create and deploy — any half-decent computer science graduate can do it.

Aidan Gomez’s comment at the Wall Street Journal CEO Council Summit bears repeating: “Humans are kind of the average of the ideas they are presented with.” That is going to be the mantra fuelling AI election hacking; we will be presented with many more bad ideas than good ones.

Surreptitious lies, finely worded half-truths and cunning distortions will be coming at us in the next election.

I would say be cautious, but I am not sure how we do that. DM  

Steven Boykey Sidley is a professor of practice at JBS, University of Johannesburg. His new book, It’s Mine: How the Crypto Industry is Redefining Ownership, is published by Maverick451 in SA and Legend Times Group in UK/EU, available now.