OpenAI Uncovers Covert Iranian Influence Operation Using AI to Manipulate US Elections
OpenAI, the leading AI research lab, has recently uncovered a cluster of ChatGPT accounts linked to an Iranian influence operation targeting the U.S. presidential election. According to a blog post released on Friday, the operation utilized AI-generated articles and social media posts to spread misinformation. This is not the first time OpenAI has encountered state-affiliated actors using ChatGPT for malicious purposes, as they disrupted similar campaigns in the past.
This tactic is reminiscent of previous attempts by state actors to influence election cycles using social media platforms. Now, with the advancement of generative AI technology, these groups are using AI to flood social channels with misinformation. OpenAI is taking a proactive approach by banning accounts associated with these efforts as they are identified.
OpenAI's investigation was aided by a Microsoft Threat Intelligence report, which identified the group, known as Storm-2035, as part of a broader campaign that has sought to influence U.S. elections since 2020. Storm-2035 is an Iranian network that operates multiple fake news sites and engages with U.S. voter groups on opposing ends of the political spectrum.
Despite these efforts, OpenAI found that Storm-2035's articles were not widely shared, and their social media posts received minimal engagement. Even so, the operation underscores how cheaply and quickly such campaigns can be assembled using AI tools like ChatGPT, regardless of whether they gain traction.
As the election approaches, we can expect more of these covert influence operations to target online platforms. It is essential for individuals to be vigilant and discerning when consuming news online. Stay informed and question the sources of the information you encounter to avoid falling victim to misinformation campaigns.