PUBLISHED ON 18 SEP 2025
recordedfuture.com
Insikt Group®
Executive Summary
Since March 2025, Insikt Group has observed CopyCop (also known as Storm-1516), a Russian covert influence network, creating at least 200 new fictional media websites targeting the United States (US), France, and Canada, in addition to websites impersonating media brands, political parties, and movements in France, Canada, and Armenia. CopyCop has also established a regionalized network of websites posing as a fictional fact-checking organization, publishing content in Turkish, Ukrainian, and Swahili — languages never featured by the network before. Including the 94 websites targeting Germany reported by Insikt Group in February 2025, this amounts to over 300 websites established by CopyCop’s operators in the year to date, many of which have yet to be publicly documented, marking a significant expansion from our initial reporting on the network in 2024.
These websites are very likely operated by John Mark Dougan with support from the Moscow-based Center for Geopolitical Expertise (CGE) and the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (GRU). CopyCop uses these websites as infrastructure to disseminate influence content targeting pro-Western leadership and publish artificial intelligence (AI)-generated content with pro-Russian and anti-Ukrainian themes in support of Russia’s offensive operations in the global information environment.
While the network’s scope in terms of target languages and countries has expanded, its primary objectives almost certainly remain unchanged: undermining support for Ukraine and exacerbating political fragmentation in Western countries backing Ukraine. Insikt Group has also observed CopyCop pursuing secondary objectives, such as advancing Russia’s geopolitical interests in its broader sphere of influence, including in Armenia and Moldova. CopyCop’s narratives and content in support of these objectives are routinely amplified by an ecosystem of social media influencers as well as other Russian influence networks like Portal Kombat and InfoDefense.
Like its objectives, CopyCop’s tactics, techniques, and procedures (TTPs) remain broadly unchanged, with marginal improvements designed to strengthen the network’s reach, resilience, and credibility. Tactics and techniques used for content dissemination typically include deepfakes, lengthy dossiers intended to embarrass targets, and fake interviews with alleged whistleblowers making claims about political leaders in NATO member states like the US, France, and Germany. Insikt Group also identified new evidence that CopyCop uses self-hosted, uncensored large language models (LLMs) based on Meta’s Llama 3 open-source models to generate AI content rather than relying on Western AI service providers.
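For readers unfamiliar with what "self-hosted" generation looks like in practice, the sketch below shows the kind of request a local content pipeline might send to an uncensored Llama-3-8B model served behind an OpenAI-compatible HTTP endpoint (as tools like llama.cpp's server or vLLM commonly expose). The endpoint URL, model name, and prompt are illustrative assumptions, not infrastructure observed by Insikt Group.

```python
import json

# Placeholder values, NOT observed CopyCop infrastructure:
ENDPOINT = "http://localhost:8080/v1/chat/completions"  # local inference server
MODEL = "llama-3-8b-uncensored"                         # hypothetical model name

def build_generation_request(headline: str) -> dict:
    """Build the JSON payload a self-hosted article-generation
    pipeline might POST to a local OpenAI-compatible endpoint."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system",
             "content": "You are a news writer. Write a short article."},
            {"role": "user",
             "content": f"Write an article with the headline: {headline}"},
        ],
        "temperature": 0.8,
        "max_tokens": 600,
    }

payload = build_generation_request("Example headline")
body = json.dumps(payload)  # serialized request body sent to ENDPOINT
```

Because the model runs locally, no request ever reaches a Western AI provider's moderation or abuse-detection systems — which is precisely the operational advantage self-hosting offers an influence network.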
Relative to other Russian influence networks, CopyCop’s impact remains significant: targeted influence content promoted by its websites and an ecosystem of pro-Russian social media influencers and so-called “journalists” regularly obtains high rates of organic engagement across multiple social media platforms, and has a precedent for breaking into mainstream political discourse. Persistently identifying and publicly exposing these networks should remain a priority for governments, journalists, and researchers seeking to defend democratic institutions from Russian influence.
Key Findings
So far in 2025, CopyCop has widened its target languages to include Turkish, Ukrainian, and Swahili, and its geographic scope to include Moldova, Canada, and Armenia, while sustaining influence operations targeting the US and France. The network is also leveraging new infrastructure to publish content, marking a significant expansion of its activities targeting new audiences.
CopyCop’s core influence objectives remain eroding public support for Ukraine and undermining democratic processes and political leaders in Western countries supporting Ukraine.
CopyCop’s TTPs are broadly unchanged from previous assessments, with only marginal improvements to increase the network’s reach, resilience, and credibility. Newly observed TTPs include evidence of CopyCop using self-hosted LLMs for content generation, employing subdomains as mirrors, and impersonating media outlets.
Insikt Group has identified two uncensored versions of Meta’s Llama-3-8b model that are likely being used by CopyCop to generate articles.
The network is also increasingly conducting influence operations within Russia’s sphere of influence, including targeting Moldova and Armenia ahead of their parliamentary elections in 2025 and 2026, respectively. This is a broader trend observed across the Russian influence ecosystem.
Background
Insikt Group previously documented CopyCop in May and June 2024, in addition to the network’s attempts at influencing the 2024 French snap elections, 2024 US presidential elections, and 2025 German federal elections. Reporting from other organizations such as Clemson University, VIGINUM, NewsGuard, Microsoft, European External Action Service, and Gnida Project has broadly corroborated our initial assessments of the network’s objectives, targets, and infrastructure, in addition to our attribution of part of the network’s activities to John Mark Dougan, a US citizen based in Moscow. The Washington Post and the US Department of the Treasury have also since established links between Dougan, the CGE, and the GRU. The GRU reportedly helped fund self-hosted LLM infrastructure, while the CGE was likely responsible, with Dougan’s assistance and direction from the GRU, for the creation of deepfakes and inauthentic content targeting political leaders in the US, Ukraine, France, and other countries.