We (society, the government, social media companies, etc.) were clearly not prepared for the onslaught of fake news during the 2016 election. Since then, prevention and identification techniques have improved and become more accessible. However, fake news continues to evolve, and the newest threat might be deepfakes.
Deepfakes are videos, images, or audio manipulated by software, typically artificial intelligence, to produce media that appears to be real.
Deepfakes come in many different forms.
Cheap fakes (also known as dumb fakes or shallow fakes) are related to deepfakes, but they involve less sophisticated video manipulations. This does not diminish their impact, however.
A prime example of a cheap fake (and how dangerous one can be) is the 2019 viral video of Nancy Pelosi that was deliberately slowed down to make her sound drunk.
The idea behind deepfakes is not new; similar techniques and technologies have long been used in Hollywood.
The techniques to spot deepfakes are similar to the ones used to spot fake news.
When it comes to detecting deepfake videos, BuzzFeed offers five tips:
Companies have also created (or are in the process of creating) detection technology, such as software and AI tools, but there are concerns that these will never be enough.
The implications of deepfakes have many people worried. Deepfakes are getting better, more realistic, and easier to make.
So far, deepfakes have not been extensively weaponized, particularly in politics, though there have been isolated instances. The threat, however, remains.
Browse the sources below to learn more.
Right now, the creators of deepfakes seem mostly content to use the technology for fun, like putting Nicolas Cage in as many movies as possible.
The Streaming Wars Roundtable (a deepfake example)
A roundtable discussion "featuring" Robert Downey Jr., George Lucas, Tom Cruise, Ewan McGregor, and Jeff Goldblum.