
Written by: Iryna Matsiuk, Lviv Polytechnic National University

1. Introduction

The “Baltic States” – Estonia, Latvia, and Lithuania – represent a frontline in the contemporary contest between liberal democracy and authoritarian influence. As members of both NATO and the European Union, they symbolize the post-Cold War expansion of Western institutions into regions historically dominated by Moscow. Given their geostrategic importance and historical experiences under Soviet rule, these nations are particularly exposed to multifaceted attempts by the Russian Federation to reassert influence. Since Russia’s full-scale invasion of Ukraine in February 2022, these hybrid threats have intensified, taking the form of increasingly sophisticated efforts to disrupt democratic processes, particularly elections.

This article explores the strategy and tactics of Russian interference in the Baltic electoral sphere, focusing on the evolution of hybrid and cognitive warfare. It analyzes how Estonia, Latvia, and Lithuania have been targeted with digital subversion, disinformation, economic coercion, and the manipulation of ethnic and religious identities. The study grounds recent developments in a historical trajectory, showing how contemporary methods mark a shift from earlier propaganda and soft power tactics to a covert, weaponized approach to information and influence. Particular attention is given to the 2023–2024 election cycles, during which interference tactics evolved significantly in response to changing geopolitical pressures.

1.1 Historical Context: From Post-Soviet Influence to Hybrid Aggression

The origins of Russia’s modern interference strategies lie in the Soviet tradition of “active measures” (активные мероприятия), which included disinformation, psychological operations, support for front organizations, and the cultivation of influence agents abroad. These methods were central to the KGB’s foreign intelligence operations during the Cold War, often aimed at undermining NATO unity and sowing discord within Western societies.

With the collapse of the USSR in 1991, the newly independent Baltic republics swiftly oriented westward, joining the EU and NATO in 2004. This realignment was perceived by the Kremlin as a geopolitical loss. In response, Russia sought to maintain influence in the region through more subtle tools of soft power, including the proliferation of Russian-language media, educational programs, and support for diaspora organizations such as the Russkiy Mir Foundation and the Gorchakov Public Diplomacy Fund.

By the late 2000s, however, the Kremlin’s toolkit evolved significantly. A pivotal moment came in 2007, when Estonia became the target of a massive wave of cyberattacks following the relocation of a Soviet-era war memorial. The attacks, attributed to Russian actors, paralyzed banks, government websites, and media outlets. This marked one of the first known uses of cyber tools in state-sponsored political coercion and underscored the vulnerabilities of digitally advanced democracies.

In the following decade, the Kremlin integrated cyber capabilities, information operations, and the use of state-backed media into a broader doctrine of “hybrid warfare.” Analysts such as Mark Galeotti (2016) argue that Russia’s use of “non-linear warfare” – combining covert, plausible-deniable actions with overt political narratives – was field-tested in the Baltic region before being exported to other theaters such as Ukraine and Western Europe. The 2014 annexation of Crimea further accelerated the refinement of these methods.

Importantly, the Kremlin’s strategy has shifted from centralized propaganda to networked disinformation. Russian operations increasingly rely on fragmented, localized channels, including fringe news sites, social media influencers, and pseudo-NGOs. The emphasis is not necessarily to convince, but to confuse and divide, as emphasized in Peter Pomerantsev’s analysis of modern Russian information tactics (2014).

Examples of this evolution include:

  • Estonia (2009): The Estonian government documented Russian intelligence efforts to recruit ethnic Russian youth for political activism and demonstrations against perceived discrimination.
  • Latvia (2011): The Harmony Centre party, though domestically rooted, received overtly favorable coverage from Russian state media, while Russian intelligence promoted narratives of cultural suppression.
  • Lithuania (2014): Fake news stories about alleged NATO soldiers committing crimes circulated widely, traced to troll farms linked to St. Petersburg-based operations (NATO StratCom 2016).

As Russia began integrating cyber, informational, economic, and military tools into a holistic doctrine of hybrid warfare, the Baltic States became a testing ground for operations that blended plausible deniability with strategic impact (Galeotti 2016). These methods foreshadowed the intensification of efforts seen after 2022, as the Kremlin recalibrated its toolkit to circumvent growing Western countermeasures.

1.2 Conceptual Framework: Hybrid and Cognitive Warfare

Hybrid warfare refers to the combination of conventional and unconventional means to achieve political objectives without triggering a full-scale military conflict. In the Russian context, this includes cyberattacks, disinformation, support for political proxies, and economic tools (Fridman 2018). A particularly insidious subset is cognitive warfare, which seeks to manipulate human perception, erode trust in institutions, and induce political paralysis by flooding the information space with competing narratives.

As articulated by Russia’s Chief of the General Staff, Valery Gerasimov, the “Gerasimov Doctrine” emphasizes information warfare as a primary theater of modern conflict. According to Gerasimov, “[the role of] non-military means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness” (Gerasimov 2013). The phrase “Gerasimov Doctrine” is widely used but misleading; even its popularizer later clarified that it is not an official doctrine but a Western shorthand for Russian thinking on non-military means (Galeotti 2018). I therefore use it as an analytic convenience, not a programmatic statement. In the Baltic context, this translates into operations targeting media ecosystems, social media platforms, minority identities, and electoral integrity.

Beyond Gerasimov’s articulation, academic theorists have further clarified the dynamics of strategic influence. For instance, securitization theory, as developed by the Copenhagen School, posits that states construct threats not simply through capabilities, but through discourse (Buzan, Wæver, and de Wilde 1998). In the Baltic context, Russian operations often rely on framing electoral or cultural developments as existential threats to Russian-speaking minorities, thereby justifying their informational interventions.

Moreover, strategic narrative theory offers another useful lens. It suggests that actors like Russia aim to impose their preferred interpretations of global order by constructing persuasive, identity-linked storylines (Miskimmon, O’Loughlin, and Roselle 2013). The Kremlin’s depiction of the West as morally decadent and itself as a defender of traditional civilization is a key example.

Importantly, hybrid and cognitive tactics often overlap. While hybrid warfare may involve physical infrastructure sabotage or cyberattacks, cognitive operations aim to shape the interpretation of such events, thereby affecting voter behavior and institutional credibility.

Following the Copenhagen School, I treat interference as a sequence of speech acts that elevate routine political issues into existential threats requiring extraordinary measures (Buzan, Wæver, and de Wilde 1998). In Baltic elections, pro-Kremlin channels repeatedly reframe language policy, history curricula, or LGBT rights as threats to the survival of Russian-speaking identity; this facilitates exceptional claims (“the system is rigged,” “participation is futile”) that depress turnout and trust (VDD 2023; VSD 2024).

Drawing on strategic narrative theory, Russia projects a storyworld in which it safeguards “traditional civilization” against a decadent West (Miskimmon, O’Loughlin, and Roselle 2013). In Estonia, this surfaces as claims that e-voting “silences” Russian speakers; in Latvia, as stories of cultural repression; in Lithuania, as warnings that Euro-Atlantic loyalties impoverish ordinary families. Each draws local salience from pre-existing cleavages, then scales via cross-platform amplification (RIA 2023; VDD 2023; VSD 2024). Understanding this dual-level approach is essential for interpreting Russia’s evolving interference model.

2. The Modern Phase of Interference: Tactics and Patterns Post-2022

Since Russia’s full-scale invasion of Ukraine, efforts to destabilize the Baltic States have intensified. As Baltic defenses and media literacy have improved, Russian operatives and affiliated networks have localized their operations and shortened campaign timelines. The emphasis has shifted to granular, hyperlocal influence operations, often targeting specific voter blocs with tailored disinformation.

In Estonia, the 2023 parliamentary elections saw an uptick in fake social media accounts spreading divisive narratives about the ruling Reform Party and amplifying tensions between Estonian- and Russian-speaking communities. An investigation by the Estonian Internal Security Service (KAPO) revealed that over 300 troll accounts were linked to Russian disinformation campaigns that aimed to lower voter turnout among minorities.

Latvia’s 2022 parliamentary elections coincided with the banning of several Russian-language media outlets that had previously served as soft power instruments. In response, Russian actors promoted unofficial news aggregators on Telegram and VKontakte, many masquerading as “independent” Latvian news. These channels repeatedly published content questioning the legitimacy of the elections, with a particular focus on electoral reforms that reduced the influence of Russian-speaking parties.

Lithuania, which has taken a hard line against pro-Kremlin narratives, reported a wave of phishing attacks against electoral officials in mid-2023. According to Lithuania’s Ministry of National Defence, these campaigns aimed to obtain access to voter databases and municipal email servers. Concurrently, local NGOs tracking disinformation noted the proliferation of narratives linking Lithuania’s support for Ukraine to alleged economic decline and corruption, attempting to alienate rural voters.

A common feature across all three states was the weaponization of synthetic media – AI-generated images, videos, and fake expert commentaries. One notorious case involved a deepfake video of a prominent Latvian politician allegedly accepting bribes from Western sponsors to oppose “traditional values.” Though quickly debunked, the video had already been viewed over 150,000 times before platforms intervened.

What differentiates this phase from previous interference is the speed and adaptability of tactics. Russian disinformation networks now use machine learning to refine their messaging, often testing dozens of narratives before selecting the most effective. Moreover, many operations originate from third countries such as Serbia or Kazakhstan to obscure attribution and evade sanctions.

These tactics reflect not only technological advancement but a strategic recalibration that seeks long-term erosion of public trust rather than immediate political wins.

3. Case Studies of Electoral Interference: 2023–2024

3.1 Estonia

Estonia has been hailed as a model of digital governance, having pioneered internet voting in 2005. However, this very strength also presents vulnerabilities. In the lead-up to the 2023 parliamentary elections, Estonia experienced a surge of cyber operations attributed to pro-Kremlin actors.

According to the Estonian Information System Authority (RIA), cyberattacks between February and March 2023 targeted government servers, voter databases, and the digital ID infrastructure. Though no votes were altered, the attacks generated significant media coverage and uncertainty, reducing public confidence in digital platforms.

A coordinated disinformation campaign followed, primarily through Russian-language Telegram channels and local proxies. It amplified narratives suggesting electoral fraud and discrimination against the Russian-speaking population. NATO StratCom documented a spike in false claims about the e-voting system, including allegations that it “systematically excluded Russian voices” (NATO StratCom 2023).

These efforts corresponded with increased online support for political parties that adopt a more critical stance toward Estonia’s Euro-Atlantic alignment. A notable example is the Centre Party (Keskerakond), which has historically drawn strong support from Estonia’s Russian-speaking minority due to its platform emphasizing social equity, bilingualism, and pragmatic foreign policy. Although the Centre Party explicitly condemned Russian aggression and did not endorse any disinformation narratives, pro-Kremlin media exploited its association with minority voters to indirectly amplify their messaging. Disinformation content on Telegram and fringe websites subtly framed the party as marginalized by the political mainstream, thereby reinforcing divisions and mistrust. This illustrates a key feature of cognitive warfare: exploiting legitimate political representation to create the appearance of foreign alignment, even where none exists.

3.2 Latvia

In contrast to Estonia’s emphasis on cyber vulnerabilities, Latvia’s case highlights how historical grievances can be leveraged through socio-political infiltration. Latvia’s demographic profile – with a large Russian-speaking minority concentrated in urban centers and border regions – has long made it susceptible to influence operations. The 2023 municipal elections in Daugavpils and Riga became flashpoints for alleged Russian interference.

Investigative journalism by Re:Baltica and official findings from the Latvian State Security Service (VDD) revealed covert support from Russian-linked NGOs for specific candidates and political movements. These groups included the Gorchakov Foundation and “Compatriots Abroad” programs, offering financial, logistical, and media support to local activists advocating “neutrality” in Latvian foreign policy (VDD 2023).

Moreover, Sputnik Latvia, although formally banned, continued to operate indirectly through mirror sites and social media influencers. These outlets disseminated messaging that framed the Latvian government as hostile to Russian culture and language rights. Political scientist Ivars Ijabs notes that, while these narratives did not drastically alter the electoral results, they contributed to lower trust among minority voters and increased polarization (Ijabs 2024).

The Latvian case demonstrates how foreign interference can exploit unresolved historical grievances and linguistic cleavages to deepen societal fragmentation, thereby weakening democratic cohesion over time.

3.3 Lithuania

Lithuania’s 2024 presidential election featured distinct tactics that underscored the Kremlin’s adaptive approach. Unlike in Estonia and Latvia, the primary modes of interference were economic and religious, with digital methods playing a lesser role.

In April 2024, Lithuania experienced unexpected fluctuations in its liquefied natural gas (LNG) imports. The Ministry of Energy later confirmed that supply irregularities were likely coordinated by Russian entities seeking to pressure public opinion through economic discomfort (LME 2024). As energy prices rose, opposition candidates accused the government of failing to diversify effectively, a narrative that was echoed by automated bot networks on platforms like Facebook and X.

Simultaneously, the Russian Orthodox Church intensified its influence within minority parishes. According to the State Security Department of Lithuania (VSD), sermons and religious newsletters were used to subtly promote skepticism toward “Western liberalism” and to endorse candidates perceived as more traditionally oriented. While religious freedom was not directly violated, the blurring of spiritual and political messaging raised alarm among civil society watchdogs (VSD 2024).

According to a joint annual report by Lithuania’s State Security Department (VSD) and the Second Department of Operational Services (SDOS), pro-Kremlin actors actively attempted to mobilize the country’s pro-Russian electorate, deploying an array of indirect tactics designed to delegitimize the electoral process and erode trust in democratic institutions. As reported by the Lithuanian news outlet Delfi and cited by European Pravda, Lithuanian intelligence assessed that these forces succeeded in engaging the maximum possible number of pro-Russian voters – but crucially failed to influence political outcomes or policymaking.

Lithuanian authorities noted that no direct evidence was found to suggest that the votes cast for pro-Russian candidates resulted from coordinated Russian intelligence operations. However, several pro-Russian public figures contributed indirectly to hostile information campaigns by questioning the legitimacy, transparency, and democratic character of the elections. This rhetoric aligned with – and arguably amplified – longstanding Kremlin narratives aimed at portraying the Baltic States as authoritarian, xenophobic, or discriminatory toward Russian-speaking minorities.

Lithuanian intelligence emphasized that such disinformation served the broader objectives of Russian and Belarusian propaganda, which seeks to discredit Lithuanian democracy on the international stage. The narratives in question typically claim that Lithuania is rewriting its historical memory, embracing Nazi ideology, and pursuing Russophobic policies. These talking points echo themes identified by the EUvsDisinfo project, such as the frequent portrayal of Lithuania as glorifying fascist collaborators or distorting the outcomes of World War II to fit a Western agenda (EUvsDisinfo).

These narratives also work to delegitimize Lithuania’s governance by suggesting that Russian-speaking citizens face systemic discrimination, particularly in language rights, access to public services, and political participation. While there are genuine debates within Lithuania about cultural integration, the evidence suggests that such concerns are exploited – not originated – by Russian media to inflame divisions and weaken civic cohesion.

In a particularly revealing element of the 2024 report, Lithuanian officials underscored the dual threat of these influence operations: not only do they attempt to alter voter behavior or perceptions, but they also aim to provoke a psychological state of distrust – a classic example of cognitive warfare. This tactic is consistent with the broader Russian approach often labeled the “Gerasimov Doctrine”: achieving strategic goals through non-military means, primarily information manipulation and societal disruption.

Despite these efforts, the report concluded that Lithuanian society demonstrated high resilience. Public awareness campaigns, real-time fact-checking platforms, and a robust civil society played key roles in countering misinformation. Furthermore, the failure of pro-Russian forces to secure tangible policy influence reinforced the strength of Lithuania’s democratic institutions.

Nonetheless, Lithuanian officials remain cautious. VSD director Darius Jauniškis stated publicly in 2024 that “Russia cannot be defeated solely with information and sanctions – the only effective language is force.” His remarks reflect the growing consensus within Lithuania’s national security establishment that hybrid threats must be addressed in both civilian and military domains, especially as Moscow may recalibrate its tactics in response to its failures in conventional warfare and international diplomacy. However, while official reports emphasize state-backed origins, independent verification of some claims remains difficult due to operational secrecy.

These developments underscore a key insight: even when direct electoral manipulation fails, the Kremlin still views Baltic elections as fertile ground for testing disinformation tactics and refining cognitive warfare strategies. Lithuania’s 2024 election cycle demonstrates both the enduring threat and the growing competence of Baltic democracies in countering it.

Lithuania’s example reveals the importance of national identity as a target vector for cognitive warfare, particularly through religious and historical symbolism. This case highlights the Kremlin’s flexibility in calibrating interference methods to local conditions, exploiting both economic levers and soft cultural power to influence electoral behavior.

3.4 Shifts in Strategy: Post-2022 Evolution of Interference

The post-2022 phase marks a significant escalation in Russia’s toolkit. Rather than relying on mass media or overt propaganda, newer tactics reflect a highly targeted, multidimensional approach:

  • Cyber operations are now more selective, focusing on voter infrastructure and information repositories rather than mass disruption.
  • Disinformation campaigns utilize artificial intelligence to produce convincing fake content, such as fabricated videos of Baltic politicians allegedly criticizing NATO or the EU.
  • Economic coercion is timed to electoral cycles, aiming to provoke discontent without triggering outright confrontation.
  • Cultural and religious proxies are mobilized to influence voter attitudes without engaging in formal politics.

These changes reflect lessons learned from earlier failures in Ukraine and the West, where more overt methods were easily debunked. The shift suggests a longer-term strategy of corrosive influence rather than immediate disruption.

In all three Baltic states, Russia’s approach has become less predictable, more localized, and harder to detect. It now aims less at swinging specific elections and more at eroding long-term trust in democratic governance.

3.5 Domestic and International Countermeasures

Baltic governments have responded with a mix of technological, legislative, and societal tools:

  • Estonia has upgraded its e-voting system with blockchain-based verification and enhanced public transparency measures.
  • Latvia amended its Political Financing Law in 2023 to impose stringent transparency rules on foreign-linked NGOs and campaign donors.
  • Lithuania launched the InfoShield initiative in 2024, an inter-agency platform combining intelligence, media monitoring, and civic engagement.

At the EU level, the European Democracy Action Plan (2020) set the baseline for platform cooperation and sanctions against foreign interference, while the 2023 Defence of Democracy package proposed transparency rules for foreign-funded influence and stronger coordination across member states (European Commission 2020; European Commission 2023).

NATO’s Strategic Communications Centre of Excellence (StratCom COE), based in Riga, now plays a pivotal role in training member states in counter-disinformation strategies and electoral protection. Its partnerships with civil society actors help balance security imperatives with democratic freedoms.

Yet, challenges remain. Coordination between national and supranational actors can be inconsistent, and public fatigue over constant warnings of interference may reduce the efficacy of future alerts. Building resilience thus requires sustained civic education and ongoing innovation.

4. Beyond the Baltics: Russian Electoral Interference in Central and Eastern Europe

As attention turned to the European Parliament elections in 2024 and to Poland’s presidential contest on June 1, 2025, several non-Baltic states in Central Europe – Czechia, Slovakia, and Poland – faced sustained foreign information manipulation and interference (FIMI) linked to Russian state or proxy actors. While the channels varied from covert media financing to deceptive impersonation and deepfakes, the through-line was consistent: operations calibrated to polarize, depress trust in institutions, and shape the electoral agenda, prompting coordinated governmental and EU-level countermeasures (EEAS 2025).

In Czechia, authorities publicly exposed and sanctioned a Prague-based influence hub built around the website Voice of Europe. On March 27, 2024 the Czech government designated Viktor Medvedchuk – an influential pro-Kremlin Ukrainian politician – and his associate Artem Marchevskyi, alongside the company operating voiceofeurope.com, after Czech intelligence concluded the platform was being used to channel pro-Kremlin narratives and covertly support selected politicians ahead of the EU elections (Reuters 2024a; MFA Czechia 2024). The EU then extended these measures Union-wide on May 27, 2024, imposing travel bans and asset freezes on Medvedchuk, Marchevskyi, and the entity behind Voice of Europe (Council of the EU 2024a; Reuters 2024b). Czech intelligence reporting for 2023 similarly described Russian-origin influence activity – via front media, intermediaries, and funding conduits – as a continuing priority threat to Czech democratic processes (BIS 2024). The actions in Prague illustrated a now-familiar model: a hybrid blend of PR-style platforming for sympathetic voices, information-laundering across “pseudo-media,” and opaque financial inducements, followed, once exposed, by national listings and coordinated EU sanctions (Council of the EU 2024b).

Slovakia offers a different but closely related pattern: a volatile information environment in which last-minute deceptive content can move quickly. Two days before the September 30, 2023 parliamentary vote, a fabricated audio clip (a voice-clone “deepfake”) purporting to feature a leading opposition figure circulated widely on social media, illustrating how low-cost synthetic media can seed narratives at precisely the moment fact-checking capacity is thin and the campaign silence period begins (Meaker 2023; OSCE/ODIHR 2023a). OSCE/ODIHR’s election reports for 2023 and 2024 noted a polarized media sphere, the rapid spread of misleading content, and challenges in countering online manipulation under tight timelines (OSCE/ODIHR 2023a; 2025). The Slovak angle also intersects with the Czech Voice of Europe exposure: in May 2024, reporting indicated that Slovakia granted temporary protection status to Marchevskyi, the platform’s operational figure, even as Czech measures remained in force – an example of how cross-border legal and procedural asymmetries can complicate mitigation (Reuters 2024c). Taken together, Slovakia’s case highlights both the tactical innovation (use of AI-assisted deception) and the structural vulnerabilities (fragmented information oversight and cross-jurisdictional gaps) that FIMI actors can exploit.

In Poland, a frontline state in support of Ukraine and a pivotal EU member, Russian-linked operations concentrated on impersonation, translation-based laundering, and sustained narrative campaigns. EU and EEAS analyses have documented the long-running “Doppelgänger” ecosystem – networks of cloned news sites and coordinated inauthentic accounts that mimic reputable outlets to push tailored disinformation in multiple EU languages, Poland included – especially around the June 2024 European elections (EUvsDisinfo 2024a; EEAS 2025). On December 16, 2024 the Council adopted measures targeting hybrid campaigns threatening EU states, including sanctions addressing infrastructures used by operations such as Doppelgänger – an acknowledgement that stable, reusable technical assets (domains, hosting, content farms) are the backbone of repeat interference attempts (Council of the EU 2024b). As Poland approached the June 1, 2025 presidential vote, EEAS/EUvsDisinfo flagged a coordinated FIMI push on April 9, 2025 that deployed deceptive video impersonations of Polish outlets (TVP World; Gazeta Wyborcza), falsely alleging that security services sought to postpone the election – content designed to erode confidence in electoral administration (EUvsDisinfo 2025). Polish authorities and national expertise centers also mobilized: NASK (the national research institute) publicized pre-election interventions and rapid-response workflows for disinformation incidents, complementing platform takedowns and fact-checking networks (NASK 2025). These responses unfolded against an intensely watched contest; the National Electoral Commission (PKW) subsequently posted official results. Regardless of outcome, the immediate target of such operations is often the process itself: turnout, trust, and acceptance of results (PKW 2025).

Across these three cases, several operational constants emerge. First, contested provenance – from front media with opaque funding (Czechia) to impersonation videos (Poland) and AI-assisted audio (Slovakia) – remains the essential tactic for blending manipulative content into domestic information streams. Second, timing is strategic: assets activate in the short windows when lags in fact-checking and moderation offer the greatest payoff (Meaker 2023; EUvsDisinfo 2025). Third, institutional learning has accelerated. Czech intelligence disclosures and sanctions, OSCE/ODIHR’s cycle of election reporting in Slovakia, and Poland’s NASK-led pre-election posture all fed into a broader EU response: blacklisting the Voice of Europe network, hardening legal tools against hybrid campaigns, and publishing iterative, data-driven FIMI threat assessments (Council of the EU 2024a; Council of the EU 2024b; EEAS 2025).

5. Conclusion: Defending Democracy in the Cognitive Era

The Kremlin’s hybrid interference in the Baltic States underscores a central paradox of modern democracy: its openness and pluralism, while sources of strength, also provide vectors for exploitation. Russian influence operations have evolved from blunt propaganda to nuanced, integrated campaigns that blur the line between internal dissent and foreign manipulation.

Russia’s intensified activity in the Baltic region after its invasion of Ukraine in 2022 reflects both strategic necessity and geopolitical reprioritization. Facing unprecedented sanctions, international isolation, and a faltering military campaign, the Kremlin has sought to destabilize neighboring NATO and EU member states through non-military means. The Baltic States – historically part of the Soviet sphere and home to sizable Russian-speaking minorities – have become key targets in this campaign. Their vocal support for Ukraine, leadership in advocating tougher EU sanctions, and provision of military aid to Kyiv have placed them squarely in Moscow’s ideological and strategic crosshairs.

According to Lithuania’s State Security Department, the Kremlin increasingly views the region not merely as a buffer zone but as a battleground for influence within Europe itself. Hybrid tactics – including cyberattacks, disinformation, and influence operations – allow Russia to project power and retaliate without direct confrontation. The Baltic States’ geographical proximity and historical memory make them especially susceptible to information warfare, which Russia uses to foment internal divisions, question the legitimacy of their political institutions, and erode public trust. This escalation is not merely tactical but also symbolic: it is intended to demonstrate that no amount of Euro-Atlantic integration can insulate a post-Soviet state from Russian reach.

Yet, the Baltic response – grounded in institutional innovation, civic resilience, and international solidarity – offers a model for democratic adaptation. As the EU and NATO recalibrate their defenses for the cognitive era, the Baltic experience should inform wider strategies to protect electoral integrity across Europe and beyond.

Ultimately, this contest is not merely about elections but about the future of governance in a digitally interconnected world. Safeguarding democracy requires sustained investment in media literacy, electoral transparency, and a vigilant, informed citizenry.

The evidence from 2023–2024 suggests that institutional verifiability, regulatory hygiene in the information space, and civic literacy – rather than content takedowns alone – are the most reliable predictors of electoral resilience in small democracies targeted by modular trust-corrosion campaigns.

Bibliography

  1. BIS (Security Information Service, Czechia). 2024. Annual Report 2023. Prague: BIS.
  2. Buzan, Barry, Ole Wæver, and Jaap de Wilde. 1998. Security: A New Framework for Analysis. Boulder, CO: Lynne Rienner.
  3. Council of the European Union. 2024a. “Foreign Information Manipulation and Interference: EU Imposes Sanctions against the ‘Voice of Europe’ Network.” Press release, May 27, 2024.
  4. Council of the European Union. 2024b. “EU Imposes Measures Addressing Hybrid Campaigns Threatening the EU or Its Member States.” Press release, December 16, 2024.
  5. EEAS (European External Action Service). 2025. 3rd Report on Foreign Information Manipulation and Interference (FIMI). Brussels: EEAS, March 2025.
  6. European Commission. 2020. European Democracy Action Plan. Brussels: European Commission.
  7. European Commission. 2023. Defence of Democracy Package (Proposal on Transparency of Interest Representation). Brussels: European Commission.
  8. EUvsDisinfo (EEAS). 2024. “Doppelganger Strikes Back: Unveiling FIMI Activities Targeting European Parliament Elections.” June 19, 2024.
  9. EUvsDisinfo (EEAS). 2025. “Missiles and Lies. Again.” April 2025 (on the April 9 coordinated impersonation videos targeting Poland’s 2025 presidential election).
  10. Fridman, Ofer. 2018. Russian “Hybrid Warfare”: Resurgence and Politicization. Oxford: Oxford University Press.
  11. Galeotti, Mark. 2016. Hybrid War or Gibridnaya Voina? Getting Russia’s Non-Linear Military Challenge Right. Rome: NATO Defense College.
  12. Gerasimov, Valery. 2013. “The Value of Science Is in the Foresight.” Military-Industrial Courier, February 2013.
  13. GLOBSEC. 2024. Vulnerability Index 2024: Central & Eastern Europe. Bratislava: GLOBSEC.
  14. Ijabs, Ivars. 2024. “Electoral Behavior and Influence in Latvia’s Russian-Speaking Regions.” Baltic Journal of Political Science, no. 4: 44–68.
  15. KAPO (Estonian Internal Security Service). 2025. Annual Review 2024–2025. Tallinn: KAPO.
  16. Latvian Public Broadcasting (LSM). 2023. “Latvian Security Police Report on Foreign Interference.” Riga, September 2023.
  17. Lithuanian Ministry of Energy (LME). 2024. “Energy Supply Disruptions and National Security Risks.” Vilnius: LME, April 2024.
  18. Meaker, Morgan. 2023. “A Deepfake Audio Recording Derailed an Election in Slovakia.” WIRED, October 3, 2023.
  19. MFA of the Czech Republic. 2024. “EU Puts Voice of Europe and Two Other Entities on Sanctions List as a Result of Czech Proposal.” May 27, 2024.
  20. Miskimmon, Alister, Ben O’Loughlin, and Laura Roselle. 2013. Strategic Narratives: Communication Power and the New World Order. New York: Routledge.
  21. NASK (Naukowa i Akademicka Sieć Komputerowa). 2025. “Dezinformacja przed wyborami. NASK reaguje” [Disinformation ahead of the elections: NASK responds]. Warsaw: NASK.
  22. NATO Strategic Communications Centre of Excellence (StratCom). 2023. Hybrid Threat Monitoring Report: Baltic Elections 2023. Riga: NATO StratCom.
  23. OSCE/ODIHR. 2023. Slovakia, Early Parliamentary Elections, 30 September 2023: Limited Election Observation Mission Final Report. Warsaw: OSCE/ODIHR.
  24. OSCE/ODIHR. 2025. Slovakia, Presidential Election, 23 March and 6 April 2024: Election Observation Mission Final Report. Warsaw: OSCE/ODIHR.
  25. PKW (Państwowa Komisja Wyborcza). 2025. “Wybory Prezydenta Rzeczypospolitej Polskiej 2025 – Wyniki głosowania w ponownym głosowaniu” [2025 Presidential Election of the Republic of Poland: voting results in the second round]. Warsaw, June 2025.
  26. Pomerantsev, Peter. 2014. Nothing Is True and Everything Is Possible: The Surreal Heart of the New Russia. New York: PublicAffairs.
  27. Re:Baltica. 2023. “Foreign Influence in Local Elections: The Hidden Hand.” Re:Baltica Investigative Series, November 2023.
  28. Reuters. 2024a. “Czechs Sanction Medvedchuk, Website over Pro-Russian EU Political Influence.” March 27, 2024.
  29. Reuters. 2024b. “EU Imposes Sanctions on Voice of Europe, Businessmen.” May 27, 2024.
  30. Reuters. 2024c. “Slovakia Gives Protection to Man Accused of Running Pro-Russian Influence Campaign.” May 1, 2024.
  31. RIA (Estonian Information System Authority). 2023a. “Cybersecurity Report on 2023 Elections.” Tallinn: RIA, March 2023.
  32. RIA (Estonian Information System Authority). 2023b. Cyber Security in Estonia 2023. Tallinn: RIA.
  33. RIA (Estonian Information System Authority). 2023c. “Cyber Threats Increased in Q1 2023, with Effects on Elections.” Tallinn: RIA.
  34. SSD (State Security Department of Lithuania). 2024. “Annual Threat Assessment 2024.” Vilnius: SSD, May 2024.
  35. VDD (Latvian State Security Service). 2024. Annual Report 2023. Riga: VDD.
  36. VSD/AOTD (Lithuania). 2024. National Threat Assessment 2024. Vilnius: State Security Department (VSD) and Second Department of Operational Services.


