Election Interference Through Coordinated Social Media Campaigns: A Practical Guide to What Comes Ahead, Based on the Romanian Case
January 24, 2025
The Romanian elections highlighted the immense power of social media as a tool for political mobilization. Disillusioned voters, frustrated with corrupt and clientelistic governments, were swayed by a promising and charismatic figure, Calin Georgescu. This extremist candidate capitalized on social media to project himself as a messianic figure, a pattern increasingly familiar across Europe.
In Romania, the anti-system vote was a driving force, yet the surprising result was that support did not consolidate around the far-right party leader George Simion, despite his massive online and offline following and a rising political party. Instead, Calin Georgescu, a relatively obscure figure, captured the far-right vote.
Georgescu’s mainstream breakthrough was driven by a carefully crafted TikTok campaign. In just 23 days before the election, views on his primary hashtag #calingeorgescu surged from 6 million to 140 million. Expert Forum’s research revealed that this viral surge was fueled by inauthentic, coordinated accounts—over 1,000 fan pages were converted into exclusive promotional tools for Georgescu.
Social media platforms have a clear vulnerability to influence operations: they do not proactively track such operations unless they are pressured to do so. The EU now has a kind of digital constitution, the Digital Services Act (DSA), which defines a “systemic risk” as one relating to the spread of illegal content, or of content that may have negative effects on the exercise of fundamental rights, on civic discourse, or on electoral processes, among others.
The DSA requires platforms such as Meta and TikTok to carry out risk assessments detailing how they are addressing such risks. The problem is that this provision was not enforced in Romania’s case, and it will probably take time and political pressure before the digital legislation is enforced elsewhere in the EU.
We need to prepare the online space before the elections. Currently, there is no clear, effective strategy for managing social media during electoral campaigns. Although the Digital Services Act (DSA) mandates that Very Large Online Platforms (VLOPs), such as Meta and TikTok, provide a national contact point for authorities, their responsiveness remains weak and largely superficial.
In monitoring influence operations, we need to follow dissemination through coordinated networks, not just the official accounts of politicians. We must advocate for a broader definition of political content that includes fan base accounts, such as sports and entertainment pages that are temporarily converted into political pages during an electoral campaign. Converting already existing pages into political ones is one of the main strategies that covert political campaigns use online.
The upcoming regulation on Transparency and Targeting of Political Advertising (TTPA), set to take effect in October 2025, aims to introduce stricter rules on political advertising. However, reactions to the regulation are mixed. Google, for example, has announced its intention to ban political advertising on its platforms altogether, raising concerns about how enforcement will balance effective regulation with the protection of freedom of speech.
More must be done to ensure a fair and equal playing field in the digital space during elections. Unfortunately, there is no quick solution. The European Commission, which is primarily responsible for regulating Very Large Online Platforms (VLOPs) like Meta and TikTok, along with national authorities known as Digital Service Coordinators (DSCs), must develop a comprehensive and proactive strategy before elections. This strategy should include continuous monitoring of the online information environment, with a strong focus on identifying and addressing political content manipulation, influence operations, inauthentic behavior, and undisclosed paid advertisements.
A promising approach to effectively coordinate efforts is the formation of joint working groups through advisory boards established under the Digital Services Act (DSA). Germany has already implemented this model by creating an independent advisory body composed of experts from civil society, the private sector, and academia. This group advises the national authority responsible for enforcing the DSA—in Germany’s case, the Bundesnetzagentur.
The key, and most uncomfortable, lesson from Romania’s case is the importance of a proportional response from authorities—one that protects fundamental rights and avoids abusive laws, especially when traditional political parties begin to lose power.
The annulment of the elections is a highly controversial decision, and rightly so, as it risks fueling long-term radicalization. Meanwhile, there has been little proactive monitoring or mobilization since the November elections. On the one hand, the government is failing to prove the coordinated electoral interference; on the other, it is passing legislation that will restrict the voting rights of Romanians in the diaspora and could limit the right to free speech. Censorship cannot be the solution, nor can political parties shift blame onto citizens for their voting choices.
Mădălina Voinea is the coordinator of the anti-disinformation program at the Romanian think tank ‘Expert Forum’. Her key topics are digital surveillance, international relations, and political analysis. Her main focus is the detection of disinformation targeting the Black Sea region, in particular narratives aimed at discrediting Ukraine.
This article was first published in German as part of our project “Monitoring disinformation and the far right ahead of the German elections 2025” on the website btw2025.cemas.io. Please visit the page for more insights on the topic.