
Fake videos could be the next big problem in the 2020 elections

Key Points
  • Fake news was a big problem for the 2016 election. "Deepfake" videos could be an even bigger problem in 2020.
  • States including California and Texas have enacted laws that make deepfakes illegal when they're used to interfere with elections.
  • "What we are seeing now is that (cyberwar) has a twin called "likewar," the hacking of people on social networks, by driving ideas viral through likes, shares, and lies," said Peter Singer, cybersecurity strategist at New America think tank.
A woman views a manipulated video of President Donald Trump and former President Barack Obama, illustrating how deepfake technology can deceive viewers.
Rob Lever | AFP | Getty Images

Fake news was a big problem for the 2016 election. "Deepfake" videos could be an even bigger problem in 2020.

Deepfake technology can be used to create videos that seem to show politicians saying things they never said, or doing things they never have done. The technology first gained widespread attention in April 2018, when comedian Jordan Peele created a video that pretended to show former President Barack Obama insulting President Donald Trump in a speech.

The technology is a problem not only because the videos are fake and easy to make, but also because, like "fake news" articles on social media, they are likely to be shared.

"Deepfakes can be made by anyone with a computer, internet access, and interest in influencing an election," said John Villasenor, a professor at UCLA focusing on artificial intelligence and cybersecurity. He explained that "they are a powerful new tool for those who might want to (use) misinformation to influence an election."

VIDEO: How easy is it to make a deepfake video? (12:11)

Experts warn that deepfakes can weaponize false information and, because fake content is so easy to create, videos can be produced and distributed quickly, reaching millions of viewers within seconds.

The term "deepfakes" refers to manipulated videos or other digital representations produced by sophisticated artificial intelligence that yield seemingly realistic, but fabricated images and sounds.

2020 elections

Paul Barrett, adjunct professor of law at New York University, explained that there are two ways deepfake videos could affect elections.

For one, Barrett said, "a skillfully made deepfake video could persuade voters that a particular candidate said or did something she didn't say or do."

What we are seeing now is that (cyberwar) has a twin called 'likewar,' the hacking of people on social networks, by driving ideas viral through likes, shares, and lies.
Peter Singer
senior fellow, New America

A video released on Facebook in June appeared to show House Speaker Nancy Pelosi stumbling through a speech when, in reality, she had not.

Villasenor told CNBC that deepfakes can undermine the reputations of politicians and easily influence voter sentiment, making them both dangerous and "powerful."

"If there are a multitude of deepfakes over the course of an election campaign, voters could grow cynical about the ability to tell truth from falsehood. Cynicism could lead to apathy, low voter turnout, and disillusionment with the entire political system," said NYU's Barrett.

Domestic disinformation

It is not currently a federal crime in the U.S. to create fake videos. But "using a fake video to commit another crime — such as extortion or fraud or harassment — would be illegal under the laws covering the other crimes," said Barrett.

He added that the legality of creating deepfakes could change in the future, as a number of bills hoping to curb their use have been introduced in Congress.

The first federal bill targeting deepfakes, the Malicious Deep Fake Prohibition Act, was introduced in December 2018. Meanwhile, states including California and Texas have enacted laws that make deepfakes illegal when they're used to interfere with elections.

In June, the DEEPFAKES Accountability Act, short for "Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act," was introduced. If passed, it would require creators of false videos to label them as such or face up to five years in prison.

"Indeed, the technology can be used for both entertainment, business, and politics, so it is unlikely to be outlawed ever completely," said Peter Singer, cybersecurity and defense focused strategist and senior fellow at policy think tank, New America.

He added that although the technology is legal, deepfakes should be labeled to let viewers know that what they're seeing is a simulation. "Just as @realdonaldtrump has a small blue check on his account to let you know that it is him," Singer wrote.

Deepfake technology is on the rise at a time when surveys show most Americans are worried about fake news.

Nearly seven in ten Americans (68%) say made-up news and information greatly affect confidence in government institutions, according to a 2019 survey conducted by Pew Research Center. More than half (54%) of the 6,127 respondents said misinformation has impacted Americans' confidence in each other.

That survey also found that half of respondents see false news as a big problem for the country, a larger share than said the same of terrorism (34%), illegal immigration (38%), racism (40%) and sexism (26%).

VIDEO: The rise of deepfakes and how Facebook, Twitter and Google work to stop them (12:56)

On the corporate side, social media behemoth Facebook was criticized for failing to identify the manipulated Pelosi video as it circulated.

In response, Facebook and Microsoft promised to collaborate with top universities across the country and create a large database of fake videos to study detection methods.

'Hostilities that never really happened'

The danger of the technology goes beyond domestic borders. Because it's easy to create and distribute on the internet, malicious governments or other actors could use deepfakes to interfere with elections in other countries — as the U.S. intelligence community alleges Russia did in 2016.

And the possible malicious uses don't end at the voting booth.

"Deepfakes could be used internationally to embarrass or incriminate heads of state or other prominent politicians," said NYU's Barrett. "They could also be exploited to help start military conflict by falsely portraying hostilities that never really happened."

Looking ahead, the NYU Stern Center for Business and Human Rights published a report warning that the 2020 elections could see interference from China, Russia and Iran.

"For the last 15 years, business, governments, and political parties have had to face the threat of the hacking of their computer networks, commonly called cyberwar. What we are seeing now is that it has a twin called 'likewar,' the hacking of people on social networks, by driving ideas viral through likes, shares, and lies," said Singer.