Deepfakes are a dangerous new propaganda tool

What you need to know:

  • Like most fake news, a deepfake can go viral within a very short period and do irreversible damage to a person’s character or cause.

  • Deepfakes are dangerous because of what the trend portends for truth as we know it, and because of the repercussions for processes and systems that rely on trustworthy information to function.

A flawless fake video of a popular politician endorsing a rival candidate going viral just a day before elections?

Better still, a mock-up of a mainstream daily newspaper breaking convincing ‘fake news’ about a highly regarded public figure in a compromising situation.

Far off? Wrong. The reality is closer than you think.

FALSIFIED

A technology capable of delivering the above two scenarios is within grasp and nobody seems to know what to do about it.

Welcome to the world of ‘deepfakes,’ a phenomenon that is taking the internet by storm.

A deepfake is a video, audio clip, image or presentation that has been manipulated using artificial intelligence (AI) and other modern technology tools to produce convincing falsified content.

Deepfakes manifest in the form of pictures, videos or audio clips of a celebrity, a politician or another known public figure saying or doing something that they never did. The videos, images or audio clips are believable to the human eye and ear as original accounts of something that happened, when in actual fact they are fake.
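For readers wondering how this works under the hood, the classic face-swap technique rests on a surprisingly simple idea: a single AI network (an ‘encoder’) learns the facial features two people share, while a separate ‘decoder’ per person learns to redraw that person’s face. Swap the decoders, and person A’s expressions appear on person B’s face. The sketch below, written in Python with the PyTorch library, is purely illustrative; the network sizes, the 64-pixel face crops and all names are assumptions of mine, not the workings of any specific tool.

```python
# Illustrative sketch of the shared-encoder / two-decoder autoencoder
# behind classic face-swap deepfakes. All sizes are assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),    # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),   # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # would be trained only on faces of person A
decoder_b = Decoder()  # would be trained only on faces of person B

# The swap: encode a frame of person A, then decode with B's decoder,
# putting A's expression on B's face.
frame_of_a = torch.rand(1, 3, 64, 64)  # placeholder for a real face crop
fake_frame = decoder_b(encoder(frame_of_a))
print(fake_frame.shape)  # torch.Size([1, 3, 64, 64])
```

In a real system, the encoder and the two decoders would first be trained on thousands of face crops of each person; the point of the sketch is only the decoder swap at the end.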

Some recent examples include a video of Facebook boss Mark Zuckerberg supposedly pledging his allegiance to Spectre, an evil organisation mentioned in the James Bond movies. Of course, Zuckerberg never pledged such an allegiance. Others who have been featured in deepfakes include US President Donald Trump and celebrity Kim Kardashian.

Locally, we are yet to witness any high-profile deepfake incident, but one can almost bet that it is just a matter of time before such a video or audio clip surfaces among the early adopters of the technology, most likely political circles or rival corporate brands.

DAMAGE

The fear is that by the time deepfakes are rolled out in Kenya, the technology will have leapfrogged in complexity to the point where such content is almost impossible to verify. The trouble with deepfakes is that they are a propaganda tool on steroids: they can slip past much of the available detection technology and are extremely hard to expose.

Deepfakes blur the truth and deliberately spread falsehoods under the guise of truth. The deepfake precedent means that people might take messages from such videos or audio clips at face value, as it is almost impossible to differentiate the real from the unreal.

Like most fake news, a deepfake can go viral within a very short period and do irreversible damage to a person’s character or cause.

Deepfakes are dangerous because of what the trend portends for truth as we know it, and because of the repercussions for processes and systems that rely on trustworthy information to function.

Only small amounts of fake news are needed to disrupt a conversation and change perceptions. Once confusion over what is fake and what is real sets in among the general populace, there is a risk that people will stop trusting the validity of any video at all, even those containing true information.

EDITING

The other concern with deepfakes is that the technology is readily available to anyone with an internet connection and some basic audio-visual editing skills. Anyone, anywhere, can create a falsehood and send it across the world in a matter of seconds. In the US, legislators are already exploring ways to tame the deepfake problem. Technology experts across the world are also looking for ways to ensure that deepfake videos are identified as soon as they hit social media.
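To give a flavour of what that detection work can look like, one common research approach is to fine-tune an ordinary image classifier to label individual video frames as real or fake. The sketch below, again in Python with PyTorch and its torchvision library, is a deliberately simplified, hypothetical illustration; the model choice, input size and function names are my assumptions, not any lab’s actual system.

```python
# Hypothetical sketch of frame-level deepfake detection: repurpose an
# off-the-shelf image classifier as a two-class real-vs-fake model.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ordinary photos, then replace its
# final layer with a two-class head: real vs. deepfake.
detector = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
detector.fc = nn.Linear(detector.fc.in_features, 2)
detector.eval()

def score_frame(frame: torch.Tensor) -> float:
    """Returns the (illustrative, untrained) probability a frame is fake.

    `frame` is a normalised 3x224x224 tensor; in practice it would be a
    face crop extracted from each video frame before classification.
    """
    with torch.no_grad():
        logits = detector(frame.unsqueeze(0))
        return torch.softmax(logits, dim=1)[0, 1].item()

# Placeholder frame; a real pipeline would decode video and crop faces.
print(score_frame(torch.rand(3, 224, 224)))
```

An untouched two-class head like this would of course guess at random; a real detector would first be trained on large sets of labelled genuine and manipulated frames, which is exactly the data race the experts are now running.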

Closer home, the next Kenyan elections are three years away, and deepfakes will most certainly present a significant challenge when the time comes.

Is it now time for strategic communication and technology specialists in Kenya to come together to counter deepfakes?

The writer is General Manager and Founder of PVG Kenya, a strategic communications and marketing agency based in Nairobi. [email protected]