Beware of deepfakes in the AI era, urges Kaspersky

In our rapidly advancing digital age, the rise of artificial intelligence (AI) and machine learning presents both opportunities and challenges. Among these challenges, deepfakes stand out as a growing concern. These are AI-generated replicas of people, which can be in the form of speech, photos, or videos.

Kaspersky emphasises that while creating deepfakes remains time-consuming and labour-intensive, their potential for misuse is undeniable. Bethwel Opil, Enterprise Client Lead at Kaspersky in Africa, warns that as AI technology evolves, targeted deepfake attacks will likely increase, ranging from blackmail and financial fraud to the spread of misinformation on social media.

The Dark Side of Deepfakes

Recent research by Kaspersky has revealed the availability of deepfake creation tools on darknet marketplaces. These tools are offered for various malicious purposes, including fraud, blackmail, and data theft. Shockingly, Kaspersky experts estimate that one minute of a deepfake video can be obtained for as little as $300.

Moreover, there is a significant digital-literacy gap among internet users in Africa. According to the Kaspersky Business Digitisation Survey¹, only 25% of employees in the Middle East, Turkiye, and Africa (META) region could distinguish a real image from an AI-generated one in a test². This lack of awareness puts organisations at risk, especially when employees are primary targets for phishing and social-engineering attacks.

Real-World Risks

Imagine a cybercriminal creating a fake video of a CEO authorising a fraudulent wire transfer. Such deepfake videos can be used to deceive employees into compromising corporate funds. Opil highlights a real-life incident where a finance worker was tricked into transferring $25 million due to a deepfake video impersonating the company’s CFO.

Protecting Against Deepfakes

To safeguard against the threats posed by deepfakes, Kaspersky recommends the following measures:

  • Stay Vigilant: Be wary of suspicious calls with poor sound quality or unnatural speech patterns.
  • Recognise Deepfake Characteristics: Look out for jerky movements, inconsistent lighting, and lips out of sync with speech in videos.
  • Verify Information: Never make decisions based solely on emotions. Always double-check information through multiple channels.
  • Update Cybersecurity Practices: Regularly review and enhance organisational cybersecurity protocols.
  • Stay Informed: Utilise tools like Kaspersky Threat Intelligence to stay updated on the latest deepfake trends.
  • Educate Employees: Foster a culture of awareness by educating employees about deepfakes and their potential risks.
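
For teams that triage suspicious requests, the checklist above can be sketched in code as a simple red-flag scoring helper. This is an illustrative sketch only: the flag names, weights, and escalation threshold below are assumptions for the example, not part of Kaspersky's guidance.

```python
# Hypothetical triage helper: scores a request against common deepfake red flags.
# The flags, weights, and threshold are illustrative assumptions, not Kaspersky guidance.

RED_FLAGS = {
    "poor_audio_quality": 2,       # muffled sound or unnatural speech patterns
    "jerky_movements": 2,          # visual artefacts in video
    "inconsistent_lighting": 1,
    "lips_out_of_sync": 3,
    "urgent_emotional_appeal": 2,  # pressure to act before verifying
    "unverified_channel": 3,       # request not confirmed via a second channel
}

def triage_score(observed_flags):
    """Return a cumulative risk score for the red flags observed in a request."""
    return sum(RED_FLAGS.get(flag, 0) for flag in observed_flags)

def needs_escalation(observed_flags, threshold=4):
    """Escalate for manual, second-channel verification once the score reaches the threshold."""
    return triage_score(observed_flags) >= threshold
```

In practice, a request scoring above the threshold would be held until the information is double-checked through an independent channel, in line with the "Verify Information" advice above.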


As AI technology continues to advance, the threat of deepfakes looms ever larger. It's crucial for individuals and organisations across Africa to be vigilant, informed, and proactive in protecting themselves against this growing cyberthreat. By following Kaspersky's recommendations and investing in cybersecurity education, we can mitigate the risks posed by deepfakes and safeguard our digital future.



Photo by Tianyi Ma on Unsplash






References: ¹Kaspersky Business Digitisation Survey 2023 – Survey of 2,000 employees across SMBs & enterprises in the Middle East, Turkiye, and Africa.

²Deepfake Image Recognition Test – Respondents were asked to distinguish between a real image and a deepfake image of a popular American actor.

Clare Petra Matthes

Hi, I'm Clare and I am a freelance writer and tech journalist as well as the owner and founder of this site, where I review tech devices and cover emerging technology news. I also write for a number of publications and have regular tech slots on the chaiFM radio station and eNCA's Tech Matters national breakfast TV news show.
