Misinformation Comes From Deepfakes

A few weeks ago a video of the former president of the United States, Barack Obama, circulated online, in which he referred to Donald Trump with an insult. We could also watch a comic speech by Richard Nixon delivered from the Oval Office. The two videos have something in common: neither is real. Both were generated with deepfake technology, which uses artificial intelligence to alter audio or video and give the impression that someone said or did something they never actually did. The tool is already a few years old, but it is now producing results that are difficult to distinguish from reality. There is widespread concern that deepfakes could be used to manipulate reality in politics or business, and to erode trust in institutions whose credibility is already fragile.

Deepfakes began as an artificial-intelligence training exercise: computers build artificial neural networks that learn, layer by layer, to identify and replace facial features, improving with each pass. An audio deepfake works along similar lines: a real recording of a specific person is fed into a model that isolates the sounds and intonations of the voice, so that anything can then be made to sound as if spoken in the subject's characteristic timbre.
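The face-swap idea described above is often built on a shared encoder paired with one decoder per person: the encoder learns a common representation of faces, and swapping decoders at inference time "transfers" one face onto another. As a rough, hypothetical sketch (toy linear layers and random vectors standing in for real face images, plain gradient descent standing in for a real training pipeline), the structure looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for cropped face images of two people, A and B.
faces_a = rng.normal(size=(64, 16))
faces_b = rng.normal(size=(64, 16))

dim, latent = 16, 4
enc = rng.normal(scale=0.1, size=(dim, latent))    # shared encoder
dec_a = rng.normal(scale=0.1, size=(latent, dim))  # decoder specialized on A
dec_b = rng.normal(scale=0.1, size=(latent, dim))  # decoder specialized on B

def step(x, enc, dec, lr=0.01):
    """One descent step on the mean-squared reconstruction error."""
    z = x @ enc              # encode into the shared latent space
    err = z @ dec - x        # reconstruction error for this decoder
    loss = float(np.mean(err ** 2))
    dec -= lr * (z.T @ err) / len(x)           # update this person's decoder
    enc -= lr * (x.T @ (err @ dec.T)) / len(x) # update the shared encoder
    return loss

loss_a0 = step(faces_a, enc, dec_a)   # initial losses
loss_b0 = step(faces_b, enc, dec_b)
for _ in range(200):                  # alternate training on A and B
    loss_a = step(faces_a, enc, dec_a)
    loss_b = step(faces_b, enc, dec_b)

# The "swap": encode a face of A, but decode it with B's decoder,
# producing B's appearance driven by A's input.
swapped = (faces_a[:1] @ enc) @ dec_b
```

Real systems use deep convolutional networks and far more data, but the decoder swap at the end is the essential trick.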

A Dutch company called Deeptrace detects and monitors deepfakes. In September it released a study estimating that 96% of deepfakes on the internet are pornographic, inserting the faces of well-known figures into scenes that never took place. At the beginning of 2019, an English company paid more than 200,000 euros to criminals who had manipulated its chief executive's voice to issue instructions with sensitive consequences for the company.

For now, there is a growing fear that a few seconds of manipulated footage could destroy a reputation, tip the scales in a tight election, or even drive up a company's share price. The potential uses are many, and the cases will keep multiplying. Politicians and senior business leaders are two of the main risk profiles, given the sheer number of public recordings of them. Once a video goes viral on the internet, even a fake one is very difficult to contain, and distinguishing reality from manipulation is getting harder all the time.
