A Microsoft Threat Analysis team warned that China will exploit AI-generated content in upcoming elections in India, South Korea, and the United States, reports Asian Lite News.
With major elections taking place around the world this year, particularly in India, South Korea and the US, a Microsoft Threat Analysis team has warned that China will create and amplify AI-generated content to benefit its interests.
Although the chances of such content affecting election results remain low, China's growing experimentation with augmenting memes, videos, and audio will likely continue and may prove more effective down the line.
According to the tech giant, China is using fake social media accounts to poll voters on the issues that divide them most, in an effort to sow division and possibly influence the outcome of the US presidential election in its favour.
“China has also increased its use of AI-generated content to further its goals around the world. North Korea has increased its cryptocurrency heists and supply chain attacks to fund and further its military goals and intelligence collection. It has also begun to use AI to make its operations more effective and efficient,” the company said in a blog post.
Deceptive social media accounts operated by Chinese Communist Party (CCP)-affiliated actors have already begun posing contentious questions about controversial US domestic issues to better understand the key issues that divide US voters.
“This could be to gather intelligence and precision on key voting demographics ahead of the US presidential election,” the company warned.
China's geopolitical priorities remain unchanged, but it has doubled down on its targets and increased the sophistication of its influence operations (IO) attacks.
The Taiwanese presidential election in January this year also saw a surge in the use of AI-generated content by China-affiliated cybercriminals.
“This was the first time that Microsoft Threat Intelligence has witnessed a nation-state actor using AI content in attempts to influence a foreign election,” said the team.