Pragmatic Papers

Russia’s AI Disinformation Machine

by u/Case_Newmark

From fake ‘local news’ websites to armies of automated social media accounts, Russia has launched a digital offensive that demonstrates the evolution of disinformation into a high-tech weapon, one aimed at reshaping elections and undermining trust worldwide.

Pro-Russian bot farm in Ukraine - By Mvs.gov.ua, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=121487328

Recent investigations highlight a significant expansion of Russia’s online influence operations, blending sophisticated cyber tactics with large-scale disinformation campaigns. Experts in Foreign Affairs describe a burgeoning cyber empire in which Moscow exports its technology and propaganda globally. These trends signal an intensification of the digital front in Russia’s hybrid warfare strategy, fought with automated bots, false narratives, and foreign-based technology to shape perceptions abroad.


AI-Powered Disinformation Networks in the U.S.


Investigations have revealed sprawling networks of U.S.-based websites that masquerade as local news but spew sensational, misleading political content. The BBC reported on July 2nd that “a network of Russia-based websites posing as U.S. newspapers” was running an AI-assisted fake-news campaign aimed at the American presidential race. These sites often reuse genuine news articles, rewriting them with AI to inject false or partisan details. For example, one fabricated story claimed Ukraine’s First Lady secretly spent U.S. aid on a $4.8 million Bugatti. The hoax, “filled with errors,” nonetheless reached at least 12 million users on social media.

The media analysis firm NewsGuard now identifies roughly 1,200 such fake local outlets in the U.S. - more than the number of real local newspapers remaining.

Journalists have dubbed this trend "pink slime" journalism: low-cost sites that pretend to have a local presence but crank out low-quality fake news with no bylines and no accountability. In practice, they use authentic-sounding names for their sites while subtly twisting stories on hot-button issues to inflame readers.

US authorities have also traced many of these sites to specific operators. Notably, a Washington Post/Verge investigation found that John Mark Dougan, a former Florida sheriff’s deputy who fled to Moscow, was financed by Russia’s military intelligence (the GRU) to create dozens of fake “local” news sites (such as DC Weekly, Chicago Chronicle, and Atlanta Observer).

John Mark Dougan A.K.A. "рукав Путина" ("Putin's sleeve") - By Ruphotog - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=52279852

Under GRU direction, Dougan and his associates mixed real and fabricated content tailored to U.S. audiences. The Kremlin even tried to spoof popular US news outlets like Fox News and the Washington Post, fooling American voters into reading Kremlin-produced news content.

Disinformation researchers point out that some of Dougan’s creations went truly viral. For instance, a deepfake video smearing politician Tim Walz gained nearly 5 million views on X in 24 hours. U.S. intelligence officially attributed the video as “created by Russia”.

Dougan himself denies wrongdoing but hinted at his motives in a statement to the BBC. In it, he admitted “for me it’s a game” and “a little payback” against what he saw as Western lies. Experts warn that even when each fake story is viewed by relatively few people, the cumulative effect can be large. False narratives spread like wildfire, with Russian bots pouring gasoline over the burning remnants of our democracy.


Russian Influence Abroad

Russia’s online influence operations have similarly erupted beyond its borders. Just weeks before Germany’s February 2025 federal vote, cybersecurity researchers documented Russian-linked networks sowing panic. Reuters reported that these groups have been pushing faked spy agency warnings of terrorist attacks in Germany, a series of entirely fabricated alerts designed to sow fear and depress voter turnout.

Many of these posts even hijack official logos (for example, a hoax video used the France24 logo) to appear authentic. German analysts warn that this strategy is twofold. The first step is to scare the public away from the polls. The second, to shift debate away from policy issues (notably support for Ukraine) toward law-and-order angst.

Reuters notes these networks also spread unfounded rumors to discredit mainstream candidates, such as attacks on conservative candidate Friedrich Merz, while implicitly helping the Kremlin-friendly Alternative for Germany (AfD) party. As Felix Kartte of the Mercator Institute observes, “a scared society is much more sensitive to authoritarian narratives”. By driving Germans into a panic about security, the operatives hope more people will vote by mail (so that fraud can be alleged later) and focus less on foreign policy.

Alice Weidel, co-leader of the AfD. By Olaf Kosinsky - Own work, CC BY-SA 3.0 de, https://commons.wikimedia.org/w/index.php?curid=58305291

Similar interference has been detected in other European campaigns. France’s watchdog agencies uncovered a Kremlin-linked “Doppelgänger” influence operation during the 2024 elections. Analysts found dozens of fake French news sites and social accounts that lay dormant until just before voting. When activated, they pumped out blatant propaganda. One bogus page pretending to be from Le Point carried the headline: “Our leaders have no idea how ordinary French people live but are ready to destroy France in the name of aid for Ukraine”.

Another fake site posed as Macron’s own party, promising voters €100 to cast ballots for him, then redirecting readers to the genuine campaign website.

Recorded Future experts even found that the operators were embedding AI prompts in the site code to auto-rewrite articles with a conservative slant. France’s cybersecurity agency Viginum has bluntly warned that such “digital information manipulation campaigns have become a veritable instrument of destabilization of democracies”.

Across Europe, the footprint of these influence cells is clear. EU intelligence agencies have been tracking Kremlin “Doppelgänger” networks in multiple countries. Czech and Belgian authorities say they dismantled similar Russian-backed networks around last year’s European Parliament elections, and Romania even canceled its December 2024 presidential vote after uncovering an "aggressive" hybrid interference plot.

In sum, analysts note that dozens of “pseudo-news” sites and bot accounts lie in wait across the continent, ready to inundate social media with fear-mongering content at any critical moment. By manufacturing security scares, cloning outlet websites, and flooding platforms with AI-generated stories, these operations aim to pit voters against each other, and undermine confidence in democratic processes.


Putin’s Cyber Empire and Global Tech Influence

Russia’s digital offensive goes beyond bots and fake news. In late April 2024, the Kremlin’s security chief Nikolai Patrushev convened a conference in St. Petersburg with top security officials from Africa, Asia, Latin America, and the Middle East. Foreign Affairs reports that the focus was on “information sovereignty and security”—Kremlin code for exporting cybertechnologies that can shield clients from Western spying.

Moscow sells this as a defensive measure: just protection against foreign surveillance, nothing more to see here. In practice, it means building parallel tech networks aligned with Russian standards. As one analyst noted, Russia is slowly but surely extending its cyber influence across the world, marketing cybersecurity capabilities as protection against perceived Western interference.

Major Russian cybersecurity firms have inked deals overseas. Positive Technologies signed a distribution agreement in Cairo, securing a foothold in Africa and the Middle East. Kaspersky Lab partnered with the African Union’s Smart Africa initiative to shape the continent’s cybersecurity sector. These agreements typically involve installing Russian-made telecommunications gear, surveillance cameras, network equipment, and even custom software into partner countries’ infrastructure. By embedding this kit, the Kremlin gains leverage over key systems. Once installed, Russian hardware and software could provide backdoors for Moscow’s intelligence services, a point underscored by U.S. sanctions on firms like Positive Technologies and Kaspersky for alleged ties to the FSB. Russia's "anti-surveillance shield" is in fact the opposite: the foundation of a cyberespionage network. We should treat words from the Kremlin like Mimir treats words from Odin.

Domestically, Russia has already codified this approach. Its 2019 sovereign internet law forces providers to install equipment that lets the government reroute or even sever Russia’s internet from the global web. The law was officially sold as a way to defend against cyberattacks, but it effectively hands authorities full control over all national online traffic. Now the Kremlin is trying to normalize this model internationally, encouraging partners to adopt similar internet-control regimes.

Kaspersky Lab, By Alexxsun - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=38045714

As a Human Rights Watch report explains, the sovereign internet framework “envisages the full transfer of control over online communication networks to a government agency … enabling the government to directly block whatever content it deems undesirable”.

In short, Russia is promoting the idea of a splintered, state-controlled internet that aligns with Moscow’s vision of security. By shifting clients onto Russian systems, Moscow deepens its global influence, gaining access to other nations’ networks and even a hand in training their cyber specialists.

Key dimensions of Russia’s cyber-export strategy include:

- Selling cybersecurity products and services abroad through firms such as Positive Technologies and Kaspersky Lab.
- Embedding Russian-made telecommunications gear, surveillance cameras, network equipment, and custom software into partner countries’ infrastructure.
- Promoting “information sovereignty” and sovereign internet-style control regimes to partner governments.
- Training foreign cyber specialists on Russian systems and standards.

As Russia’s tech footprint expands, so does its ability to influence information flow. Large-scale adoption of Kremlin-designed systems could facilitate cross-border surveillance or allow Russia to push its own messaging through foreign networks. Any nation running critical infrastructure on Russian hardware or networks might find itself vulnerable to hidden data collection or manipulation. By exporting the sovereign internet concept, the Kremlin also aims to weaken the global norm of an open, interconnected web.

Conclusion: A New Digital Front

Altogether, these developments paint a worrisome picture of a new digital front in global influence warfare. AI-driven bots and fake media networks are flooding information channels with pro-Kremlin content. Researchers have documented networks of automated accounts and counterfeit news sites churning out lies and conspiracies about everything from the war in Ukraine to Western politics. Dan Kennedy, a professor at Northeastern University, notes that the goal is often not to convince anyone of a specific claim, but to flood the zone with false or inflammatory stories. Truth is drowned out by a sea of disinformation. Policy analysts at RAND similarly report that Russian propagandists maintain “thousands of fake accounts on Twitter, Facebook… and VKontakte” to amplify their messages.

By overwhelming social media and news feeds, Russia seeks to destroy trust in genuine information. When ordinary people cannot tell real news from fabricated stories, confidence in governments, media, and even facts themselves is undermined. This chaos serves Russian interests by sowing division and apathy: a classic “firehose of falsehood” propaganda strategy, where volume and noise do the damage.

Officials from Washington to Brussels have issued warnings about foreign disinformation ahead of upcoming elections. In the United States, experts from government and industry warned that Russia was likely the most committed and capable outside power to target the 2024 elections. In Europe, the European Parliament passed resolutions highlighting recent Russian campaigns to spread pro-Kremlin narratives on social media that could interfere with EU votes. These bodies urge vigilance and steps to shore up democratic processes against hybrid threats.

With all these Russian bots flooding our internet, it's a wonder to look back and realize that Colonel Roy Campbell explained this system to Raiden all the way back in 2001.