“The spirit of our endeavour is, To strive, to seek, to find and not to yield”

Alessandro Minuto-Rizzo, President

The Cognitive Battlefield of Hybrid Warfare

Source: Battlefield 6.
The explosion of hybrid conflicts marks the rise of an invisible war. The adversary may cut or tap submarine cables, jam or eavesdrop on the satellites that power global networks, manipulate social media feeds, and swarm the skies with unflagged drones. Hybrid warfare often takes the form of a small, remotely operated UCAV (Unmanned Combat Aerial Vehicle) whose apparent mission is to launch or trigger an explosive device; in fact, it carries a sensor package with a lethal and insidious objective: to observe, scan, map and transmit sensitive information. This is how the mosaic of cyberattacks, unconventional incursions and silent sabotage takes shape.
It is a war that does not declare itself, yet it paralyzes infrastructure, sows chaos and erodes public trust. Cyberattacks on power plants, hospitals, and military networks; unidentified drone incursions over European airports and airspace; and deniable paramilitary actions are all part of this new operational landscape.
The common thread binding these operations is disinformation: an assault on the mind. NATO is under siege, particularly along its cognitive frontiers. Disinformation campaigns orchestrated by state and non-state actors aim to manipulate public opinion, polarise societies and undermine internal cohesion. Deepfakes, troll farms, bots, and fake news are the new shock troops, deployed to influence elections, distort political narratives, destabilize governments and alliances, and erode trust in democratic institutions. The recent Moldovan and Romanian elections are textbook examples of this old/new art of war.
Disinformation tactics focus on amplifying internal vulnerabilities. Three prominent examples include:
  • Migration “weaponisation” to destabilize borders;
  • Mirror propaganda accusing Europe of aggression;
  • Mounting pressure on NATO’s eastern flank to compromise energy independence, critical infrastructure protection, supply chain stability, and communication continuity.
Defence is no longer solely military: it is psychological, social, digital. The challenge is quite serious, because the next war has already begun. And it is fought with truth and with counter-(dis)information.
Source: NDU, Linton Wells.
To counter the rampant proliferation of adversarial deepfakes, NATO must deploy generative adversarial networks (GANs) and voice and facial synthesis software capable of exposing hostile intrusions and generating counter-content. During the violent rise of ISIS and the self-proclaimed Daesh caliphate, the web was flooded with videos glorifying jihad against the West. Neural networks were activated to saturate the web with interpolated videos promoting the Dash detergent brand – a simple yet effective idea that blurred, confused, and diverted attention.
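As a rough illustration of what automated screening of synthetic imagery can look like, the sketch below flags pictures whose high-frequency spectral energy is anomalously large, one known footprint of some GAN upsampling pipelines. It assumes Python with numpy and Pillow; the file name and the threshold are hypothetical and would need calibration against real data.

# Minimal sketch of a frequency-domain deepfake screening heuristic.
# Assumption: some GAN-generated images leave periodic upsampling artefacts
# that show up as excess energy in the high-frequency band of the spectrum.
# The threshold and the input file name below are illustrative only.
import numpy as np
from PIL import Image

def radial_power_spectrum(path: str, size: int = 256) -> np.ndarray:
    """Return the azimuthally averaged power spectrum of a grayscale image."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = np.asarray(img, dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(pixels))) ** 2

    # Bin spectral energy by distance from the centre (radial frequency).
    y, x = np.indices(spectrum.shape)
    centre = size // 2
    radius = np.hypot(x - centre, y - centre).astype(int)
    radial_sum = np.bincount(radius.ravel(), weights=spectrum.ravel())
    radial_count = np.bincount(radius.ravel())
    return radial_sum / np.maximum(radial_count, 1)

def looks_synthetic(path: str, hf_ratio_threshold: float = 0.05) -> bool:
    """Flag an image whose high-frequency energy share exceeds a threshold."""
    profile = radial_power_spectrum(path)
    high_band = profile[len(profile) // 2:].sum()
    return high_band / profile.sum() > hf_ratio_threshold

if __name__ == "__main__":
    print(looks_synthetic("sample_frame.jpg"))  # hypothetical input file

A filter of this kind is only a first pass; in practice it would sit in front of heavier classifiers and human review.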
A distinct strategy is required to counter bots and troll farms on social media, which operate thousands of automated accounts spreading conspiracy theories, false news and polarizing content. Digital platforms, through their recommendation algorithms, tend to amplify previously viewed content, creating “echo chambers” that reinforce preexisting beliefs. This mechanism has been exploited to disseminate propaganda and targeted disinformation.
In this context, the appropriate countermeasure is the sabotage of engagement algorithms and behavioural targeting systems: to fragment monolithic narratives, foster critical thinking, and minimize radicalization. This includes infiltrating complex networks of websites, influencers, and Telegram groups by duplicating satellite sites, dismantling closed communities through mimicry, and recruiting compliant influencers.
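On the detection side, a first filter against automated accounts can be purely behavioural. The following minimal sketch uses invented field names and illustrative weights, not any platform's real API, to score an account on posting cadence, account age, duplicate content and follow patterns.

# Minimal sketch of a rule-based bot-likelihood score for social accounts.
# Assumption: the fields below come from a hypothetical platform export;
# the weights and cut-offs are illustrative, not tuned on real data.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_per_day: float      # average posting frequency
    account_age_days: int     # time since registration
    duplicate_ratio: float    # share of posts that are near-duplicates (0..1)
    followers: int
    following: int

def bot_score(acc: Account) -> float:
    """Return a 0..1 heuristic score; higher means more bot-like behaviour."""
    score = 0.0
    if acc.posts_per_day > 50:          # inhuman posting cadence
        score += 0.35
    if acc.account_age_days < 30:       # freshly created account
        score += 0.20
    if acc.duplicate_ratio > 0.6:       # copy-paste amplification
        score += 0.30
    if acc.following > 10 * max(acc.followers, 1):  # mass-follow pattern
        score += 0.15
    return min(score, 1.0)

if __name__ == "__main__":
    suspect = Account("user_42", posts_per_day=120, account_age_days=12,
                      duplicate_ratio=0.8, followers=15, following=4000)
    print(f"{suspect.handle}: bot score {bot_score(suspect):.2f}")

Such heuristics are easy to evade individually; their value lies in combining many weak signals across large populations of accounts.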
With the advent of generative artificial intelligence, it is now possible to fabricate articles, images and videos in seconds. These tools can simulate internal crises within adversarial states, spread fabricated news, and create misleading viral content. An eye for an eye: there is no alternative. NATO countries must develop their own architectures of silent and devastating deception: a covert network of counter-information, algorithmic sabotage, and strategic manipulation to defend the Alliance’s cognitive borders.
Here are some conceptual elements of these interoperable architectures:
  • Counter-disinformation: Detecting and neutralizing false narratives before they spread (a minimal detection sketch follows this list).
  • Algorithmic sabotage: Disrupting adversarial AI systems or poisoning their data pipelines.
  • Strategic manipulation: Shaping narratives proactively to inoculate populations against hostile influence.
  • Transparency and resilience: Educating the public, building media literacy, and strengthening democratic institutions.
  • AI governance: International norms and treaties to limit the use of generative AI in information warfare. This is an even more crucial aspect, to be discussed at least with China and Russia with regard to the integrity of deterrence chains of command vis-à-vis disruptive AI tools.
  • Tech partnerships: Collaborating with platforms and developers to detect and mitigate synthetic media threats.
  • Deepfakes & AI-generated images: Used to fabricate events, impersonate leaders, or simulate crises. A notable example was the fake image of a Pentagon explosion that briefly crashed the stock market.
  • Automated propaganda: Troll farms and bot networks use generative AI to flood social media with persuasive, misleading content.
  • Psyops: AI models analyse population sentiment and tailor messages to exploit fears, biases, and divisions. The objective is to destabilize societies from within, without firing a single shot.
  • Autonomous Decision-Making: AI agents assist in war planning by simulating scenarios, identifying vulnerabilities and optimizing strategies faster than human analysts. These tools accelerate decision cycles and can outmanoeuvre slower, traditional command structures.
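To make the first of these elements concrete, coordinated amplification often betrays itself through near-duplicate wording across nominally unrelated accounts. The sketch below assumes scikit-learn is available and uses fabricated example posts and an illustrative similarity threshold to flag pairs of posts whose TF-IDF vectors are highly similar.

# Minimal sketch of coordinated-narrative detection via near-duplicate text.
# Assumptions: posts come from a hypothetical collection pipeline; the 0.6
# cosine-similarity threshold is illustrative and would need calibration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = [
    "NATO is preparing an attack on its own citizens",      # fabricated examples
    "NATO prepares an attack on its own citizens!!",
    "Breaking: NATO is preparing an attack on its citizens",
    "Local council approves new cycling lanes downtown",
]

vectors = TfidfVectorizer().fit_transform(posts)
similarity = cosine_similarity(vectors)

# Flag pairs of distinct posts whose wording is highly similar: a burst of
# such pairs across unrelated accounts suggests coordinated amplification.
for i in range(len(posts)):
    for j in range(i + 1, len(posts)):
        if similarity[i, j] > 0.6:
            print(f"possible coordinated pair: {i} <-> {j} "
                  f"(similarity {similarity[i, j]:.2f})")

In an operational setting, the flagged clusters would feed analysts who decide whether the pattern is organic virality or an orchestrated campaign.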

 

By Gerardo Spagnuolo, Specialist in semiotics, emerging technologies, information warfare, soft power strategies and crisis communication.
