Is Russian war propaganda effective, and is it adequately countered online? According to a report submitted to the European Commission at the end of August, the answer to the first question is “yes.” The answer given by the same report to the second question is “no.”
These are worrying answers, and ones that were apparently never meant to be made public. On August 29, 2023, the Commission stated that it had “immediately” taken the report down from its website. We have our colleagues at context.com, a media outlet specializing in issues of political power, to thank for their swift response: they downloaded it even more “immediately” than Brussels was able to correct its blunder, in the window between August 24, when the report went online, and the evening of August 25, when it was taken down.
Why take down this report when it clearly supports the European policy of tighter control over social media content? The main platforms targeted by the report—Facebook, Instagram, Telegram, TikTok, X (the new name for Twitter) and YouTube—are all covered by the Digital Services Act (DSA), which aims to regulate their content.
They could have interpreted this document as a wake-up call, given that it highlights their failure to comply with “the standards of Article 35 [of the DSA] relating to the effective mitigation of risks […] in the case of the Kremlin’s disinformation campaigns.” The Commission is keeping a close eye on things and has already put in place tools to measure compliance with the DSA.
Five platforms taking on Russia’s hybrid warfare
Since August 25, the Big Five tech companies (Google, Apple, Facebook, Amazon, and Microsoft) have had to prove to Brussels that they are taking effective action against fake news and other illegal content. The Commission’s explanation is that the document was not yet finalized. Is this the only reason for this embarrassing takedown?
To try to answer this question, inCyber has examined in detail the 74-page document entitled “Digital Services Act: Application of the Risk Management Framework to Russian disinformation campaigns.” The document analyzes trends in the number and content of social media accounts directly controlled by Moscow, or in the Kremlin’s orbit, between February and November 2022.
This analysis looked at five social media platforms, four of which are considered “very large online platforms” as defined in the DSA: Facebook, Instagram, TikTok, X, and YouTube. These platforms are therefore subject to the most stringent obligations. Telegram, which the report’s authors also took a close look at, is expected to join this club soon.
Point 1: The report’s authors do not go easy on these social media platforms. “The largest social media platforms made commitments to mitigate the reach and influence of Kremlin-sponsored disinformation. Overall, these efforts were unsuccessful,” they write.
Point 2: The Russian threat must be taken extremely seriously, say the analysts appointed by the European Commission. “We find that the Kremlin’s ongoing disinformation campaign not only forms an integral part of Russia’s military agenda, but also causes risks to public security, fundamental rights and electoral processes inside the European Union,” referring to the 2024 European Parliament elections. The authors describe this threat as nothing less than “systemic.”
Point 3: The study is based on what appears to be a rigorous methodology. Its authors drew up a list of qualitative indicators (on the content of posts) and quantitative indicators (on their audience and impact on the public). They believe they can use these to measure the effect of Moscow’s propaganda, disinformation, harassment and intimidation.
What’s more, they suggest that their risk assessment and mitigation methodology can be replicated, and that it should therefore be used not only by the Commission to check that the DSA is being properly applied, but also by social media platforms to measure and implement their actions in this area.
The report’s conclusions emphasize that if social media companies had fully implemented the measures recommended by the DSA and the enhanced code of conduct before it, they would have been far more effective in their fight against Russian information warfare operations.
Pro-Russian content viewed “16 billion times”
The authors recognize, however, that the DSA is better suited to responding to inappropriate behavior by individual users, and that it should “be complemented with measures tailored specifically to mitigate state-backed disinformation and information operations.”
The document makes no secret of the success of accounts controlled directly by Russia, or of those which its authors consider to be in its close orbit, when they describe “a growing ecosystem of Kremlin-aligned accounts.” The report clearly shows that the audience for these pro-Russian accounts grew dramatically after Russian troops moved into Ukraine. They reportedly have “a total subscriber number of at least 165 million” across the EU and “in less than a year, their content was viewed at least 16 billion times.”
The authors blame this success on Russian tactics of luring and directing Internet users toward accounts not targeted by sanctions, such as those of staff members of Russian administrative bodies. There is another factor, however, that could at least partially explain these significant numbers, which are positive from Moscow’s point of view. Some people may have visited these accounts of their own accord, out of curiosity following Russia’s attack on Ukraine, or to hear a different side of the story from the one offered by European media, which support Ukraine to varying degrees.
Bias and misjudgment
This is certainly one of the biases of this report. It’s no surprise to anyone that the European Union is fully behind Kyiv, and the report’s authors are no exception. The systematic use of certain expressions is telling. The authors always refer to Russia’s “full-scale invasion of Ukraine,” whereas at the start of the war, Moscow sent 150,000 men into battle against Ukraine’s 350,000 active-duty soldiers, a ratio that would suggest to any serious military analyst that invading the whole of Ukraine was not Moscow’s objective. Similarly, the systematic use of the (negatively connoted) term “Kremlin” to refer to Russia reflects the same bias.
Without minimizing the consequences of the Kremlin’s propaganda and disinformation campaigns, the authors’ bias can lead to certain misjudgments. For example, to explain why many Internet users turned to VKontakte, Telegram, or RuTube after the outbreak of hostilities, they offer only one explanation: the Kremlin’s strategies for circumventing restrictions on the accounts it controls. They conveniently forget that these services are Russian—or at least Russian in origin—and that their Western equivalents are no longer accessible to many Internet users, as a result of retaliatory sanctions.
Are social networks the EU’s real target?
By the same token, when the report highlights the sizeable audiences of pro-Russian accounts (remember that they apparently have “a total subscriber number of at least 165 million” across Europe), the authors never put these figures into perspective by mentioning, for example, the total number of social media subscribers in the EU. Given news coverage—both online and offline—in member states, which is overwhelmingly pro-Ukraine, the authors should have put the impact of Russian destabilization operations on social media into some perspective.
It’s also worth noting that, while some account descriptions are precise, such as those relating to accounts directly or indirectly funded by Moscow, the report’s appendices provide no information on the methodology the authors used to classify influencers and other individuals as being in the Kremlin’s sphere of control. The only hint given is that these accounts use language typical of the Kremlin.
The European Commission, as the document’s sponsor, is genuinely concerned about Russia’s hybrid warfare operations, but it is the social media platforms that the report really targets. With X’s owner Elon Musk’s stated determination to pit freedom of expression against regulatory constraints, Telegram’s poor DSA compliance record (regularly singled out in the report), and the warning shots fired at Meta (Facebook’s parent company), it is definitely the social networks that have something to worry about.