In March 2023, a video purporting to show a violent altercation between a Russian-speaking family and Ukrainian soldiers at the Russian-Ukrainian border went viral on social media. Is the video genuine? OSINT provides some answers

Posted by pro-Russian figures and quickly shared by their supporters, the video racked up millions of views on social media and provoked anger among Internet users. It shows soldiers insulting a mother before firing shots near a car and fleeing the scene, leaving the family in a state of shock.

However, some people voiced their suspicions: the soldiers in the video are wearing yellow armbands, whereas Ukrainian soldiers usually wear green ones.

Furthermore, dashcams, like the one that captured the footage of the altercation, have been banned in Ukraine for a year now. The ban was introduced to prevent the Russian army from getting hold of visual information that could give them a strategic advantage on the ground.

Open-source researchers from the Bellingcat and GeoConfirmed groups took a closer look at the video frame by frame. The first second of the video shows a power line running parallel to the road.

A few seconds later, a second power line, perpendicular to the first, suggests that the altercation took place at a crossroads.

By focusing on Ukraine’s separatist areas – where dashcams are not banned – and matching spatial data, such as trees and bushes, in satellite images and the video footage, the researchers were able to pinpoint, with a high degree of confidence, where the video was shot.

Volunteers went by car and on foot to photograph the identified location. These photos were then compared with the images in the video, confirming the true location of the incident: it had not taken place on the border between Russia and Ukraine, but rather in the separatist region of Donetsk, some 30 kilometers from the border.
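As an illustration only, here is a minimal sketch of how such a side-by-side comparison could be assisted programmatically, for instance with OpenCV feature matching between a still from the video and a ground-level reference photo of a candidate site. The file names and the match threshold are hypothetical, and nothing indicates the investigators worked this way rather than by eye; a high match count is merely a hint to prioritise a location for human review.

```python
import cv2

# Hypothetical file names: a still extracted from the video and a
# ground-level reference photo taken at a candidate location.
frame = cv2.imread("video_still.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("site_photo.png", cv2.IMREAD_GRAYSCALE)

# Detect and describe local features with ORB (free and fast).
orb = cv2.ORB_create(nfeatures=2000)
kp_f, des_f = orb.detectAndCompute(frame, None)
kp_r, des_r = orb.detectAndCompute(reference, None)

# Match descriptors and keep only the strongest correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)
strong = [m for m in matches if m.distance < 40]  # illustrative cut-off

# Many consistent matches hint that both images show the same place;
# the result only flags candidates for human verification.
print(f"{len(strong)} strong matches out of {len(matches)}")
```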

Discovering the actual location of the altercation is telling: it is highly unlikely, if not impossible, that Ukrainian soldiers would have carried out an attack like this in an area that has been occupied continuously by Russian forces for almost ten years. Exposed as staged, the video was quickly debunked by Internet users. The lie intended to vilify Ukrainian soldiers in the eyes of the public had been nipped in the bud.

OSINT, the best way to disprove disinformation

The investigation, which took just a few hours to complete, was carried out by a handful of “net detectives” from the OSINT community, which brings together amateurs and professionals who are passionate about searching for and analysing information that is freely and legally available to the public. Their skills, put to good use since the start of the war in Ukraine, have helped stem the avalanche of Russian disinformation, often much more quickly than journalists and fact-checkers.

In times of war, Russian disinformation does not just trickle onto social media: it floods in. Although disinformation is not exclusive to Russia, its influence operations are prolific and particularly far-reaching, increasingly targeting countries that have traditionally been non-aligned, notably in Latin America.

By closely following pro-Russian opinion leaders on social media, such as Twitter or Telegram, OSINT analysts are able to quickly disprove a slew of spurious claims and staged events before they even reach the general public.

OSINT does much more than debunk disinformation about the war in Ukraine. It also documents equipment losses, Russian troop movements and war crimes committed against Ukrainians, such as the killing of civilians. The interactive Eyes on Russia map from the Centre for Information Resilience (CIR), an organisation that fights human rights violations and disinformation, gives Internet users information on incidents attributed to Russian forces in Ukraine. Each event is linked to visual evidence uploaded to social media platforms.

AI, the new frontier of disinformation

Despite the combined efforts of journalists, fact-checkers and “OSINTers”, Russian disinformation continues to proliferate online. With artificial intelligence, this phenomenon could rapidly grow out of all proportion. These tools, which are inexpensive and accessible to all, can be used to create images, texts and videos fabricated from scratch to support misleading narratives. AI also makes it easier to spread misleading content on social media by making influence tactics more difficult for platforms to detect.

Staying ahead of disinformation trends, OSINT analysts are already sharing tips on how to identify AI-generated and AI-powered disinformation more effectively. For example, by searching for the phrase “as an AI language model” on social media and search engines, analysts can identify websites and user accounts that use chatbots, such as ChatGPT, to write their content.
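As a toy illustration of that phrase-based search, the sketch below scans a small, hard-coded set of hypothetical posts for the tell-tale chatbot disclaimer; in practice analysts run the same query through platform search tools or search engines rather than over local data.

```python
import re

# The tell-tale disclaimer that leaks into posts pasted straight from a chatbot.
TELL_TALE = re.compile(r"as an AI language model", re.IGNORECASE)

# Hypothetical sample of collected posts.
posts = [
    {"account": "news_feed_01", "text": "As an AI language model, I cannot verify this claim..."},
    {"account": "frontline_eye", "text": "Convoy spotted heading west this morning."},
]

# Flag accounts whose text contains the phrase, for manual review.
flagged = sorted({p["account"] for p in posts if TELL_TALE.search(p["text"])})
print("Accounts to review manually:", flagged)
```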

By delving into the details, sharing their findings and quickly identifying misleading content on the war in Ukraine, OSINTers act as a firewall against the multitude of Russian influence operations.
