In the 1980s the KGB had a well-honed playbook for spreading disinformation around the world. Oleg Kalugin, a former KGB general, recalled the agency's preference for doctoring authentic documents with carefully placed alterations. The method has changed little; the technology for applying it has become far faster.
In early March 2024 a network of websites called CopyCop began publishing articles in English and French on contentious topics. The articles accused Israel of war crimes, stoked divisive political debates in America and spread absurd stories about Polish mercenaries in Ukraine. What set the operation apart was its use of large language models, probably OpenAI's, to rewrite content lifted from legitimate news outlets.
An investigation by Recorded Future, a threat-intelligence firm, found that the articles had been translated and edited to inject partisan bias. Tellingly, some still carried the prompts used to generate them, explicit instructions for the model to adopt a particular ideological stance: more than 90 French articles, for instance, had been rewritten with instructions to take a conservative stance critical of the Macron administration.
The network has ties to DC Weekly, a known disinformation platform run by John Mark Dougan, an American who fled to Russia in 2016. Within months of appearing, CopyCop had churned out more than 19,000 articles across 11 websites, some of them probably generated automatically. Crude as they are, these efforts have gained traction, particularly when supplemented with human-produced content.
While current AI-enabled forgeries may not fool discerning readers, their potential for improvement is significant. Future iterations are less likely to reveal their incriminating prompts, posing a greater challenge to detection and mitigation efforts.
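For now, a leaked prompt is itself a usable signal. The Python sketch below shows the kind of naive string-matching heuristic a researcher might run over scraped articles to flag such leakage; the phrase list and function name are illustrative assumptions, not indicators published by Recorded Future, and the approach fails as soon as operators learn to strip their prompts.

```python
import re

# Tell-tale phrases that betray a leaked LLM prompt or refusal. The list is
# a hypothetical illustration, not taken from Recorded Future's indicators.
LEAK_PATTERNS = [
    r"as an ai language model",
    r"rewrite the following article",
    r"adopt a (conservative|liberal|critical) (stance|tone)",
    r"i cannot fulfill this request",
]

def flag_prompt_leakage(article_text: str) -> list[str]:
    """Return any tell-tale patterns found in the article text."""
    text = article_text.lower()
    return [pattern for pattern in LEAK_PATTERNS if re.search(pattern, text)]

if __name__ == "__main__":
    sample = (
        "Rewrite the following article and adopt a conservative stance "
        "toward the government. Paris saw fresh protests on Monday..."
    )
    matches = flag_prompt_leakage(sample)
    if matches:
        print("Possible leaked prompt, matched:", matches)
```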
This evolution in disinformation tactics coincides with advances in AI and blockchain technology, which cut both ways. A public, append-only ledger could, in principle, let publishers register a cryptographic fingerprint of each article at the moment of publication, so that readers and aggregators can later check whether a copy has been altered. Pairing such records with AI tools that look for signs of manipulation could make tampering easier to detect and any changes transparent.
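As a concrete illustration of the verification step such a scheme would rest on, here is a minimal Python sketch in which a publisher fingerprints an article with SHA-256 and a reader later checks a copy against that fingerprint. The record format, field names and the `example-news.org` source are assumptions made for illustration; anchoring the record on an actual blockchain is left out entirely.

```python
import hashlib
import json
import time

def fingerprint(article_text: str) -> str:
    """SHA-256 digest of the whitespace-normalized article text."""
    canonical = " ".join(article_text.split())
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def publish_record(article_text: str, source: str) -> dict:
    """Build the record a publisher might anchor on a public ledger.
    Writing it to a real blockchain is outside the scope of this sketch."""
    return {
        "source": source,
        "timestamp": int(time.time()),
        "sha256": fingerprint(article_text),
    }

def verify(article_text: str, record: dict) -> bool:
    """Check a copy of the article against the anchored fingerprint."""
    return fingerprint(article_text) == record["sha256"]

if __name__ == "__main__":
    original = "Paris saw peaceful protests on Monday, officials said."
    record = publish_record(original, source="example-news.org")
    print(json.dumps(record, indent=2))

    tampered = original.replace("peaceful", "violent")
    print("original verifies:", verify(original, record))   # True
    print("tampered verifies:", verify(tampered, record))   # False
```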
Cryptocurrencies, which themselves run on blockchains, are also vulnerable to disinformation campaigns: false news can trigger sharp price swings, undermining market stability and investor confidence. Authenticating cryptocurrency-related news in the same way could blunt such manipulation and support a steadier market.
Blockchain's decentralized structure could also bolster media independence, making it harder for any single actor to control the narrative. As AI and blockchain mature, their role in fighting disinformation will grow, bringing new opportunities and new challenges for keeping information trustworthy.