Cyberveille, curated by Decio
Kremlin Propagandists Weaponize OpenAI's Video Generator https://www.newsguardrealitycheck.com/p/kremlin-propagandists-weaponize-openais
18/11/2025 11:44:35

NewsGuard's Reality Check
newsguardrealitycheck.com
Nov 17, 2025

What happened: In an effort to discredit the Ukrainian Armed Forces and undermine their morale at a critical juncture of the Russia-Ukraine war, Kremlin propagandists are weaponizing OpenAI’s new Sora 2 text-to-video tool to create fake, viral videos showing Ukrainian soldiers surrendering in tears.

Context: In a recent report, NewsGuard found that Sora 2, OpenAI's new video generator, which creates 10-second videos from a user's written prompt, advanced provably false claims on topics in the news 80 percent of the time when prompted to do so. The finding demonstrates how easily the new and powerful technology can be weaponized by foreign malign actors.

A closer look: Indeed, so far in November 2025, NewsGuard has identified seven AI-generated videos presented as footage from the front lines in Pokrovsk, a key eastern Ukrainian city that experts expect to soon fall to Russia.

The videos, which received millions of views on X, TikTok, Facebook, and Telegram, showed scenes of Ukrainian soldiers surrendering en masse and begging Russia for forgiveness.

(The original report embeds two of the videos: one supposedly showing Ukrainian soldiers surrendering, and another purporting to show Ukrainian soldiers begging for forgiveness.)

Actually: There is no evidence of mass Ukrainian surrenders in or around Pokrovsk.

The videos contain multiple inconsistencies, including gear and uniforms that do not match those used by the Ukrainian Armed Forces, unnatural faces, and mispronunciations of the names of Ukrainian cities. NewsGuard tested the videos with AI detector Hive, which found with 100 percent certainty that all seven were created with Sora 2. The videos either had the small Sora watermark or a blurry patch in the location where the watermark had been removed. Users shared both types as if they were authentic.

The AI-generated videos were shared by anonymous accounts that NewsGuard has found to regularly spread pro-Kremlin propaganda.

Ukraine’s Center for Countering Disinformation said in a Telegram post that the accounts “show signs of a coordinated network specifically created to promote Kremlin narratives among foreign audiences.”

In response to NewsGuard’s Nov. 12, 2025, emailed request for comment on the videos, OpenAI spokesperson Oscar Haines said “we’ll investigate” and asked for an extension to Nov. 13, 2025, to provide comment, which NewsGuard provided. However, Haines did not respond to follow-up inquiries.

This is not the first time Kremlin propagandists have weaponized OpenAI’s tools for propaganda. In April 2025, NewsGuard found that pro-Kremlin sources used OpenAI’s image generator to create images of action figure dolls depicting Ukrainian President Volodymyr Zelensky as a drug addict and corrupt warmonger.

Tags: newsguardrealitycheck.com, EN, 2025, Russia, Sora2, fake, disinformation, AI-generated videos