The Georgia capitol in Atlanta. Photograph: Bloomberg/Getty Images

Georgia lawmakers are using an AI deepfake video to try to ban political deepfakes

A bill working its way through the state legislature seeks to stop the use of AI tech in politics – but some far-right activists are opposed

When wrangling legislation, sometimes it’s best to sound out a problem in front of you.

In Georgia, it sounds like the state senator Colton Moore. But it only sounds like Colton Moore.

Brad Thomas, a Republican state representative and vice-chairman of the Georgia house committee on technology and infrastructure innovation, has proposed legislation outlawing the use of artificial intelligence deepfakes in political communication. To illustrate the point, Thomas presented the judiciary committee with a deepfake video using AI-generated images and audio of Moore and Mallory Staples, a former Republican congressional candidate who now runs a far-right activist organization, the Georgia Freedom caucus.

The video uses an AI tool to impersonate the voices of Moore and Staples falsely endorsing passage of the bill. A continuous disclaimer at the bottom of the video cites the text of the bill.

Moore and Staples oppose the legislation.

The AI impersonation of Moore says: “I would ask the committee: how is using my biometric data, like my voice and likeness, to create media supporting a policy that I clearly don’t agree with the first amendment right of another person?”

The video continues: “The overwhelming number of Georgians believe the use of my personal characteristics against my will is fraud, but our laws don’t currently reflect that. If AI can be used to make Colton Moore speak in favor of a popular piece of legislation, it can be used to make any one of you say things you’ve never said.”

Thomas, who co-sponsored the bill and co-wrote the video, said he and his colleagues used commonly available tools to create it.

“The particular one we used is, like, $50. With a $1,000 version, your own mother wouldn’t be able to tell the difference,” he said.

The pace of advancement in generative AI video tools is years ahead of the legislation needed to prevent abuses, Thomas said: “Cinematography-style video. Those individuals look absolutely real, and they’re AI-generated.”

The bill passed out of committee on an 8-1 vote.

Moore is not popular in Georgia’s legislative circles. His peers in the state senate threw him out of the Republican caucus in September, accusing him of making false statements about other conservatives while he was advocating fruitlessly for a special session to remove the Fulton county prosecutor Fani Willis from office.

Last week, Moore was permanently barred from the Georgia house chamber after rhetorically attacking the late speaker at a memorial service being held on the house floor.

Through the Georgia senate press office, Moore declined to comment.

In social media posts, Moore has voiced opposition to the bill, calling it an attack on “memes” used in political discourse and arguing that satire is protected speech.

Staples, in newsletters to her supporters, cited the federal conviction of Douglass Mackey last year as an example of potential harms. Mackey, who posted under the alt-right pseudonym “Ricky Vaughn”, circulated images on social media in November 2016 encouraging Black voters to “vote by text” instead of casting a real ballot, with the images styled to appear as if they had been paid for by the Clinton campaign.

Federal judges rejected Mackey’s first amendment arguments on the ground that the communications amounted to acts of fraud which were not constitutionally protected. Mackey was sentenced in October to serve seven months.

House bill 986 creates the crimes of fraudulent election interference and soliciting fraudulent election interference, with penalties of two to five years in prison and fines up to $50,000.

A person who, within 90 days of an election, publishes, broadcasts, streams or uploads materially deceptive media – defined as media that appears to depict a real individual’s speech or conduct that did not occur and that a reasonable person would take to be authentic – would be guilty of a felony, provided the media significantly influences a candidate’s or referendum’s chances of winning or confuses the administration of the election. The bill would thus also criminalize the use of deepfakes to cast doubt on the results of an election.

Deepfakes entered the 2024 election at its start, with an AI-generated robocall imitating Joe Biden telling New Hampshire voters not to vote. After the call, the Federal Communications Commission announced a ban on robocalls that use AI audio. But the Federal Election Commission has yet to put rules in place for political ads that use AI, something watchdog groups have been urging for months. Regulation lags behind AI’s capacity to mislead voters.

In the absence of federal election rules for AI content, states have stepped in, filing and, in several instances, passing bills that typically require labels on political ads that use AI. Under most of the bills filed in states, AI-generated content in a political ad without such a label would be illegal.

Experts say AI audio, in particular, can trick voters because a listener loses the visual context clues that might tip them off that a clip is fake. Audio deepfakes of prominent figures, such as Trump and Biden, are easy and cheap to make using readily available apps. Even for less well-known people, anyone with access to recordings of their voice – speeches, interviews or other media appearances – can upload those examples to train a deepfake clone of the person’s voice.

Enforcement of the Georgia law might prove challenging. Long before the emergence of AI, lawmakers struggled to find ways to rein in anonymous flyers and robocalls spreading misinformation and fraud ahead of elections.

“I think that’s why we gave concurrent jurisdiction to the attorney general’s office,” Thomas said. “One of the other things we’ve done is allow the [Georgia bureau of investigation] to investigate election issues. Between the horsepower of those two organizations, we have the highest likelihood of figuring out who did it.”

Lawmakers are only beginning to grapple with the implications of AI. Thomas expects more legislation to emerge over the next few sessions.

“Fraud is fraud, and that’s what this bill is coming down to,” Thomas said. “That’s not a first amendment right for anyone.”

Rachel Leingang contributed reporting
