Beyond Espionage: How Foreign State Disinformation Fuels Radicalisation — and How to Counter It.

11/09/2025


Deterrence Center Publication

 

The UK, like many democracies, is a frequent target of digital interference campaigns. By studying these openly, we strengthen resilience - not only nationally, but as part of a shared democratic effort.

 

With the rise of globalisation and technology, foreign state interference has evolved far beyond traditional forms such as covert intelligence operations, political manipulation, electoral meddling, and conventional propaganda. The scale, speed, and subtlety of information warfare today surpass older methods like state-sponsored newspapers, pamphlets, cultural front organisations, or shortwave radio broadcasts. While these tactics still exist, the advent of digital platforms has opened an entirely new frontier. Foreign actors now operate at the intersection of information warfare and social engineering. They leverage digital ecosystems to exploit societal divisions, erode trust in democratic institutions, and influence or radicalise individuals. This represents a complex, transnational threat embedded in the digital infrastructure of everyday life.


While the manipulation of information for political ends is not new, what is new is the weaponisation of everyday digital platforms - used not just to communicate propaganda, but to embed it seamlessly into the social fabric. Tools originally designed for connection have become operational environments for influence. Through subtle algorithmic design and content targeting, disinformation may be embedded into public discourse without users ever realising its engineered origins.


Disinformation refers to false or misleading information that is deliberately created and disseminated to deceive or manipulate. In contrast, misinformation involves the unintentional spread of falsehoods by individuals who may believe the content to be true (Wardle & Derakhshan, 2017).


These two concepts are deeply interconnected: state-sponsored disinformation often evolves into misinformation when picked up and redistributed by the public - intensifying its reach and normalising its narratives.


What distinguishes this modern interference is not simply its scale or technical sophistication, but its psychological precision. State-designed disinformation campaigns are not merely lies. They are strategic narratives, engineered to manipulate emotions, deepen grievances, and destabilise societies from within.


This is a domain where state-sponsored disinformation becomes misinformation in the hands of the public, and where its spread offers adversaries insight into national anxieties, trust deficits, and polarisation trends. Disinformation is thus a dual-use weapon: it both deceives and reveals, providing intelligence about societal dynamics and fault lines.


The Strategic Logic of Interference

Foreign state actors use digital disinformation as an asymmetric tool to overcome conventional geopolitical constraints. Online, they exploit the openness of democratic systems to shape domestic narratives, undermine adversaries’ credibility, and fuel youth radicalisation.


Unlike kinetic warfare, this form of interference requires no physical presence. It is low-cost, high-impact, deniable - and operates in a legal grey zone. These operations fall far below the threshold of casus belli; they are non-kinetic (non-physical), and the effects of disinformation campaigns are often difficult to attribute beyond doubt, let alone to gauge accurately the extent of foreign involvement in causing them. By embedding divisive content within social platforms, adversaries circumvent traditional defences, targeting belief systems, identity politics, and public trust in institutions.


The Mechanics of Radicalisation Through Foreign Interference

State and non-state actors alike now deploy carefully tailored content across digital platforms. Their tools include deepfakes, manipulated leaks, targeted ads, and bot-driven amplification campaigns. But the goal is more insidious than simple disruption: it is to entrench grievance, legitimise extremism, and destabilise public discourse until radicalisation risks becoming ambient.


Foreign state disinformation intersects with radicalisation at multiple points:

·       Exploitation of grievance: By amplifying real or perceived injustices - racial, economic, religious, or political - disinformation draws individuals toward extreme narratives.


·       Algorithmic echo chambers: Once exposed to provocative or conspiratorial content, individuals are often served increasingly radical materials through engagement-optimised algorithms.


·       Identity engineering: Foreign actors create persuasive personas or movements, offering disaffected individuals a sense of belonging, purpose and community.


·       Delegitimisation of moderates: Trusted figures - journalists, scientists, teachers, mainstream political leaders - are discredited as part of the ‘system’, driving people toward fringe voices.


What makes these operations powerful is their camouflage: simultaneously appearing as grassroots sentiment and functioning as top-down psychological operations. This duality complicates attribution: distinguishing between organic radicalisation and foreign manipulation requires tracing amplification patterns, account origins, and coordination tactics - often through forensic digital analysis rather than content alone.
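
To make the forensic point concrete: one weak coordination signal is near-simultaneous posting of identical text by many distinct accounts. The sketch below is a minimal, hypothetical illustration of detecting that single signal in Python - all account names and posts are invented, and real investigations combine many such indicators with account-origin and infrastructure analysis.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical post records: (account_id, text, timestamp). Invented data.
posts = [
    ("acct_01", "They are hiding the truth about the leak!", datetime(2025, 1, 5, 9, 0, 12)),
    ("acct_02", "They are hiding the truth about the leak!", datetime(2025, 1, 5, 9, 0, 47)),
    ("acct_03", "They are hiding the truth about the leak!", datetime(2025, 1, 5, 9, 1, 3)),
    ("acct_04", "Lovely weather in Glasgow today.",          datetime(2025, 1, 5, 9, 2, 0)),
]

def coordinated_clusters(posts, window=timedelta(minutes=5), min_accounts=3):
    """Group identical texts posted by distinct accounts within a short window.

    Many distinct accounts posting the same text almost simultaneously is one
    (weak, on its own) indicator of coordinated inauthentic behaviour.
    """
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((account, ts))

    clusters = []
    for text, entries in by_text.items():
        accounts = {account for account, _ in entries}
        timestamps = [ts for _, ts in entries]
        span = max(timestamps) - min(timestamps)
        if len(accounts) >= min_accounts and span <= window:
            clusters.append((text, sorted(accounts), span))
    return clusters

for text, accounts, span in coordinated_clusters(posts):
    print(f"possible coordination: {len(accounts)} accounts in {span.seconds}s -> {text!r}")
```

As the paragraph above notes, content alone rarely settles attribution; a signal like this becomes meaningful only when weighed alongside amplification patterns and platform-level metadata.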


Case Studies: The UK as a Strategic Target

Like other open democracies, the UK has been a consistent target for foreign state disinformation. Actors from Russia, Iran, and China, among others, have conducted sophisticated campaigns to stir division, manipulate narratives, and undermine trust.


Russia

·       2019 NHS dossier leak: Russian-linked actors leaked manipulated government documents during the UK general election to undermine public trust (Raab 2020).

·       FSB Centre 16: Conducted cyber campaigns against UK journalists and MPs for intelligence gathering and reputational disruption (NCSC 2024).

·       Troll farms: Kremlin-linked actors amplified Islamophobic narratives post-terror attacks in Manchester and Westminster, boosting divisive figures like Tommy Robinson (Innes 2020).


Iran

·       Scottish independence operations: Researchers at Clemson University’s Media Forensics Hub uncovered a network of over 80 fake X accounts linked to Iran’s Islamic Revolutionary Guard Corps (IRGC). These accounts posed as ordinary UK citizens, promoted Scottish independence, and generated more than 1 million reposts and over 3.2 million likes - representing approximately 4% of the debate on X about the issue (Atlantic Council 2020; Linvill and Warren 2024). In a related operation in January 2022, Meta removed around 100 fake Facebook and Instagram accounts, again linked to Iran, which targeted UK users and propagated pro-independence content (Meta 2022).

·       TikTok influence campaigns: A network of 65 Iran-based TikTok accounts was removed in early 2024 after targeting UK users with pro-Iranian narratives concerning the Israel–Hamas conflict, having amassed more than 116,000 followers (ITV News 2024).

·       National Security Act arrests: Three Iranian nationals were charged in June 2025 with targeting Iranian journalists operating in the UK (Reuters 2025).


China

·       Targeting of pro-democracy activists: A network of 29 Chinese-linked accounts encouraged far-right groups in the UK to intimidate Hong Kong dissidents, publishing over 150 inflammatory posts, including doxxing information and anti-immigrant rhetoric (Guardian 2025).

·       Digital smear campaigns: State-linked platforms attacked the BBC and journalists with fabricated claims of anti-China bias (Recorded Future 2021).

·       “Wolf warrior” diplomacy: Chinese officials have used social media to promote anti-Western narratives, often through misleading or conspiratorial framing (BBC 2021).


Each case reflects an increasingly integrated strategy: cyber activity, social media manipulation, and psychological targeting designed to test societal resilience. These are not isolated efforts but elements of a systemic approach combining digital manipulation with geopolitical opportunism.


Analysing these disinformation campaigns not only exposes adversarial tactics but also provides valuable insight into foreign policy objectives, target selection, and modi operandi. Such intelligence can be raised through diplomatic channels and thereby contribute to deterrence.


Converging Tactics: Foreign States and Non-State Extremists

The tactics used by foreign state actors closely resemble those used by jihadist or far-right organisations:


·       Deepfakes and synthetic media: Impersonating public figures with fabricated audio or video.


·       Hashtag hijacking and astroturfing: Co-opting popular hashtags or manufacturing the appearance of grassroots “rallies” online.


·       Micro-targeting via ad tech: Delivering emotionally charged content based on demographics, location, or behaviour.


·       Echo chamber cultivation: Algorithmic feedback loops ensure ideological saturation, not diversity.


·       Content laundering: Using fake news outlets and influencers to legitimise disinformation narratives.


This is the new radicalisation spectrum: rooted not in any single ideology, but in disinformation as a delivery system for grievance, identity distortion, and violent potential.


The Algorithmic Engine of Radicalisation

Foreign disinformation increasingly acts as a pre-radicalisation incubator. While disinformation alone may not cause violence, it:

·       Shapes attitudes toward migrants, minorities, and institutions

·       Encourages belief in conspiracy theories

·       Destabilises shared reality, making democratic dialogue harder


Social media platforms - designed to maximise engagement - have become unintentional but potent accelerants of this process. Their algorithms favour provocative content and repeatedly expose users to perspectives aligned with their existing beliefs, creating filter bubbles and digital silos (Pariser 2011; Flaxman et al. 2016).


Echo chambers, in this context, are not accidental by-products - they are engineered spaces of reinforcement. Foreign interference actors exploit this architecture by seeding content into precisely these spaces, ensuring that falsehoods are not only seen but believed.
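
To illustrate the dynamic, here is a deliberately stylised toy model of an engagement-optimised feed - a minimal sketch, not any platform’s actual algorithm; the topics, scores, and the “provocation” measure are all invented assumptions:

```python
import random

# Toy content pool: (item_id, topic, provocation_score in [0, 1]).
# "Provocation" stands in for whatever best predicts engagement; values are invented.
pool = [
    ("a1", "conspiracy", 0.9), ("a2", "conspiracy", 0.8),
    ("b1", "mainstream", 0.3), ("b2", "mainstream", 0.4),
    ("c1", "sports",     0.2), ("c2", "sports",     0.5),
]

def rank_feed(pool, affinity):
    """Score items by provocation plus the user's learned topic affinity."""
    return sorted(pool, key=lambda item: item[2] + affinity.get(item[1], 0.0), reverse=True)

random.seed(1)
affinity = {"conspiracy": 0.0, "mainstream": 0.0, "sports": 0.0}

for step in range(5):
    shown = rank_feed(pool, affinity)[:2]        # the user sees only the top of the feed
    for _, topic, provocation in shown:
        if random.random() < provocation:        # provocative items attract more engagement
            affinity[topic] += 0.2               # ...and engagement feeds back into ranking
    print(f"step {step}: affinity = {affinity}")

# After a few steps the most provocative topic dominates both the ranking and
# the user's profile: a filter bubble emerges without anyone intending it.
```

The point is structural: no one in this loop intends radicalisation, yet the feedback between ranking and engagement steadily narrows what the user sees - precisely the architecture that interference actors seed with content.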


Intelligence Dividend of Disinformation

Disinformation is not only an attack - it is also a form of reconnaissance. When state actors inject falsehoods into the digital bloodstream, they are not only distorting reality - they are also observing how societies react. Which messages go viral? Which groups amplify them? Where are trust fractures deepest? Every reaction becomes data. This feedback loop helps adversaries map societal vulnerabilities and fine-tune future influence campaigns.


In effect, hostile states collect sociological and psychological intelligence via the public reaction to their falsehoods. Every individual who reposts, reacts to, or reshapes that content becomes part of this intelligence feedback loop. These insights may be further weaponised in the future.


Strategic Recommendations

To effectively counter these evolving threats, democracies must continually recalibrate national security and digital governance frameworks through coordinated, multi-layered approaches. In this context, deterrence refers to the strategic use of capabilities, clear communication, and credible consequences to reduce the societal impact of disinformation - thereby denying adversaries the outcomes they seek.


1.     Strengthen inter-agency intelligence collaboration: Intelligence, security, and law enforcement agencies must enhance intelligence-sharing protocols to detect and blunt the domestic impact of foreign, digitally originated disinformation campaigns.

§  Identified spikes in disinformation must be swiftly communicated to operational agencies to enable timely safeguarding of vulnerable communities, critical national infrastructure, and significant national events.

§  Intelligence is not only what we gather, but also what is given - domestic intelligence can originate from community networks acting as early warning systems, offering real-time insights into emerging vulnerabilities, sentiments, or tensions before adversarial exploitation. Such intelligence can be harnessed via existing channels, such as local policing units or under the CONTEST strategy umbrella.

 

2.     Create a dedicated cross-government group: A central mission of the group will be to develop deterrence strategies that fortify the public domain and bolster national resilience against foreign influence campaigns. A permanent, centralised interdepartmental taskforce focused on analysing information manipulation and recommending countermeasures would also ensure agile, unified government buy-in and response. The group should not only integrate government departments but also include key actors from the technology sector, academia, civil society, and behavioural science.

§  Suggested title: Strategic Influence and Disinformation Taskforce.

§  This jointly established “think-and-do tank” will bridge intelligence insights, academic research, and public communication strategies to ensure coordinated and adaptive responses.

 

3.     Enhance platform accountability: Mandate greater transparency in algorithmic design and operations, require timely takedowns of foreign and malicious disinformation, and ensure robust moderation practices across all major platforms. To make accountability real, regulators must define clear compliance mechanisms - particularly around algorithmic transparency and foreign content flagging - including independent audits, disclosure requirements for targeted content, and sanctions for non-compliance. While platforms are commercial entities, their digital infrastructure now serves as critical public space, and effective deterrence depends on platform governance that aligns with democratic resilience. These measures must be implemented transparently and with oversight to safeguard civil liberties: protecting national security must not come at the cost of eroding the very democratic values adversaries seek to undermine.

 

4.     Embed digital literacy and critical thinking: Introduce and standardise digital literacy and critical thinking curricula across schools, policing, and safeguarding sectors to foster critical digital citizenship and empower individuals to discern credible information. In addition, implement continuous professional development (CPD) training for public sector personnel and frontline responders, who are often the first to detect signs of radicalisation or coordinated influence activity.

 

5.     Conduct proactive national public campaigns: Leverage behavioural science and work with trusted media partners to inoculate the public against disinformation - especially ahead of major political or electoral events. Consider piloting regionally tailored campaigns first, with feedback mechanisms to measure message retention and impact.

 

6.     Monitor spill-over and narrative evolution: Track how disinformation morphs into misinformation and disseminates across platforms and demographics. Cross-platform narrative monitoring is crucial to anticipate emerging radicalisation trends (a minimal illustrative sketch follows these recommendations).

 

7.     Coordinate internationally: Develop joint response protocols and shared threat taxonomies with allied nations, particularly around safeguarding democratic institutions and electoral processes. Propose the establishment of a Five Eyes-led working group dedicated to disinformation deterrence and intelligence-sharing to strengthen allied responses.
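
As a concrete (and deliberately simplified) illustration of recommendation 6, the sketch below clusters near-duplicate claims gathered from different platforms by lexical overlap. The platform names and claims are invented; a production system would rely on multilingual embeddings, temporal analysis, and richer metadata rather than simple token overlap:

```python
import string

# Hypothetical claims collected from different platforms. Invented data.
claims = [
    ("platform_A", "Secret document proves the election was rigged"),
    ("platform_B", "A secret document proves the election was rigged!"),
    ("platform_C", "Leaked secret document shows the election was rigged"),
    ("platform_A", "Local council bans traditional festival"),
]

def tokens(text):
    """Lower-cased, punctuation-stripped token set."""
    return {t.strip(string.punctuation) for t in text.lower().split()}

def jaccard(a, b):
    """Token-set overlap between two texts, in [0, 1]."""
    sa, sb = tokens(a), tokens(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Greedy single-pass clustering: attach each claim to the first cluster whose
# seed claim is lexically similar enough, otherwise start a new narrative.
THRESHOLD = 0.5
clusters = []
for platform, text in claims:
    for cluster in clusters:
        if jaccard(text, cluster[0][1]) >= THRESHOLD:
            cluster.append((platform, text))
            break
    else:
        clusters.append([(platform, text)])

for i, cluster in enumerate(clusters):
    platforms = sorted({platform for platform, _ in cluster})
    print(f"narrative {i}: {len(cluster)} variant(s) across {platforms}")
```

Tracking how such clusters grow, mutate, and hop between platforms over time is what allows analysts to spot a seeded disinformation narrative shading into organic misinformation.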


Conclusion

Foreign state interference is not simply a matter of national security - it is a societal resilience challenge with generational implications. It is no longer a peripheral threat - it is central to how radicalisation, social fragmentation, and digital addiction now unfold. Whether amplified by bots or spread by well-meaning citizens, disinformation becomes both the weapon and the environment in which today’s security threats evolve.


This is not espionage in the traditional sense - it is strategic social disruption, often unfolding gradually.


This threat must remain at the core of our national security and resilience agendas: it is central to how violence, mistrust, and division are engineered in the 21st century. And just as foreign actors learn from our vulnerabilities, we must learn from their methods in order to counter them. Deterrence is essential - and resilience must become a national priority to protect democracy, safety, security, and peace from this growing threat.

 

References

Atlantic Council (2020) Iranian digital influence efforts: Guerrilla broadcasting for the twenty-first century. Washington, DC: Atlantic Council. Available at: https://www.atlanticcouncil.org/in-depth-research-reports/report/iranian-digital-influence-efforts/


BBC News (2021) ‘The disinformation tactics used by China’, 10 April. Available at: https://www.bbc.co.uk/news/56364952


Flaxman, S., Goel, S. and Rao, J.M. (2016) ‘Filter bubbles, echo chambers, and online news consumption’, Public Opinion Quarterly, 80(S1), pp. 298–320. Available at: https://academic.oup.com/poq/article/80/S1/298/2199870


Guardian (2025) ‘Online campaign urged far right to attack China’s opponents in UK’, 28 April. Available at: https://www.theguardian.com/world/2025/apr/28/revealed-online-campaign-urged-far-right-to-attack-chinas-opponents-in-uk


Innes, M. (2020) ‘Russia report: Moscow’s disinformation campaign fuelling ‘political extremism’ and division in UK’, The Independent. Available at: https://www.independent.co.uk/news/uk/home-news/russia-report-uk-national-security-brexit-terror-islam-a9630126.html


ITV News (2024) ‘TikTok cracks down after finding groups operating from Iran targeting UK users’, 24 May. Available at: https://www.itv.com/news/2024-05-24/tiktok-cracks-down-after-finding-iran-backed-groups-targeting-uk-users


Linvill, D. and Warren, P. (2024) What’s hiding under the kilt? Iranian trolls for Scottish independence. Clemson, SC: Clemson University Media Forensics Hub. Available at: https://www.clemson.edu/centers-institutes/watt/hub/images/hiding-under-the-kilt1.pdf


Meta (2022) ‘Meta removes Iran-based fake accounts targeting Instagram users in Scotland’, Reuters, 20 January. Available at: https://www.reuters.com/technology/meta-removes-iran-based-fake-accounts-targeting-instagram-users-scotland-2022-01-20/


NCSC (2024) Russia’s FSB malign activity: factsheet. London: National Cyber Security Centre. Available at: https://www.gov.uk/government/publications/russias-fsb-malign-cyber-activity-factsheet/russias-fsb-malign-activity-factsheet


Pariser, E. (2011) The Filter Bubble: What the Internet is Hiding from You. London: Penguin.


Raab, D. (2020) ‘UK says Russia sought to interfere in 2019 election by spreading documents online’, The Guardian, 16 July. Available at: https://www.theguardian.com/uk-news/2020/jul/16/uk-says-russia-sought-to-interfere-in-2019-election-by-leaking-documents-online


Recorded Future Insikt Group (2021) ‘China Propaganda Network Targets BBC Media, UK in Influence Campaign’, 18 August. Available at: https://www.recordedfuture.com/research/china-propaganda-targets-bbc-uk


Reuters (2025) ‘Three Iranians in UK court accused of assisting Tehran spy service’, 6 June. Available at: https://www.reuters.com/business/media-telecom/three-iranians-uk-court-accused-assisting-tehran-spy-service-2025-06-06/


Wardle, C. and Derakhshan, H. (2017) Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Strasbourg: Council of Europe. Available at: https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c
