Lies, deceit and disinformation: The fake news with the power to change the course of war
Can fake news really change the course of a war?
Behind the scenes, hostile states can distort the stories that land on your newsfeed and change the way you see the world.
"Info Ops" have been part of warfighting for decades, but in a digitised world, supercharged by social media, the opportunities to plant disinformation have exploded.

Scottish nationalists - or state-backed operatives?
Take the US stealth bomber strike on Iran in June 2025.
It was the kind of moment built for headlines: US Air Force B-2s dropping bunker-buster bombs on Iran's underground nuclear facility at Fordow.
This was a very public demonstration of American power, reported around the world. But something else happened at the same time that barely made a ripple.
A cluster of "Scottish nationalist" accounts went offline at the exact moment the strike knocked out Iranian internet access, a pattern observed again during later internet outages imposed by Tehran to control street protests.
So here's the question: why would genuine Scottish nationalists be based in Iran? Or are we actually seeing regime-sponsored operatives with fake accounts posing as credible supporters of Scottish independence in order to destabilise the UK?

Lies and deceit
Disinformation, when people or states deliberately spread information they know to be untrue, is a powerful weapon in the military arsenal.
Not because it convinces everyone, but because it confuses enough people for reality to become negotiable.
If you want to see what this looks like at scale, look at Moldova, a country on the edge of Russia's war in Ukraine and a live test-bed for hostile influence operations.
Ana Revenco, the director of the Moldovan Centre for Strategic Communication and Countering Disinformation, says this is happening at an industrial scale.
She said the tactics being used against Moldova included deepfakes, cheapfakes, AI-promoted content and networks of thousands of inauthentic accounts constantly engaged in promoting lies and fakes.

Is the UK facing something similar?
Former National Cyber Security Centre chief Ciaran Martin recalls an audio deepfake of London mayor Sadiq Khan in 2023.
This falsely depicted him making inflammatory remarks in the run-up to Remembrance Sunday. It was circulated widely and caused a lot of anger and confusion.
"I'm not saying this is Russia because I think we genuinely don't know who did this, but no doubt you'll recall the awful but very perniciously clever deepfake of the London mayor in November 2023 about the pro-Palestinian march and its clash with Remembrance parades," he said.
"It turned out that was not an offence to make that."

Fake facts, real fear
A recent study by King's College London looked at the spike in the use of the words "dangerous" and "lawless" to describe the UK's capital city.
In 2008 they appeared 874 times, but by 2024 the same words were used more than 258,000 times.
Lord Toby Harris, the Chair of the National Preparedness Commission, said: "If you're told all the time crime in London is out of control, and there are bots feeding that dialogue all the time, you become more fearful about what’s going on.
"Actually crime statistics suggest crime in London has declined, but if that's the message you keep receiving you become more fearful."
There's no clear evidence that this spike in negativity about London is down to Russia, or any other hostile state, but this kind of amplification of anxiety and fear follows a familiar pattern.
Now-exiled Russian investigative journalist Andrei Soldatov said: "They don't have to invent fractures in Western society, they just exploit what already exists.
"They didn't create Trump for instance, they didn't create ultra-right movements in Europe, but they're trying to exploit these feelings."

Russia's early internet playbook
Long before "deepfake" became a common term, Yevgeny Prigozhin, the late and infamous leader of the Russian mercenary Wagner Group, set up the Internet Research Agency.
This employed hundreds of people to post, provoke and amplify the Kremlin's message.
The agency is widely believed to have played a role in interference with the 2016 US presidential election, prompting the United States to launch a counter-cyber operation against the Internet Research Agency in 2018.
The FBI also issued indictments against 11 Russian military intelligence officers for alleged offences connected to election interference.
Enemy disinformation isn't exclusively conducted in faraway places, either.
Take the case of Nathan Gill, a former leader of Reform UK in Wales, who was convicted in 2025 of accepting bribes in exchange for public statements supporting Russia and sentenced to a lengthy prison term.
Mr Soldatov explained: "Moscow is trying to use these narratives to erode public support for the war in Ukraine and counts on a sense of exhaustion.
"They think if they can add to this sense of fatigue, 'why should we care and why should we pay for that?' This kind of message is exactly what Moscow would love to support."

What the UK does and doesn't do in the information domain
In the UK, the British Army's 77th Brigade is often described as a dedicated unit for countering hostile disinformation and conducting information operations of its own.
But there's a key imbalance that comes up in conversations with experts: authoritarian states can operate at scale in a way Western governments often can't or won't.
Different constraints. Different rules. Different appetites for risk. And that means the public becomes the pressure point.
The aim of disinformation is to sow doubt and fuel division - to prise open cracks in public debate and erode trust.
The people behind it know what they’re doing, and it's easier than you think to get pulled in.
So next time a clip makes you instantly furious:
:: Pause
:: Check where it came from
:: Ask yourself: who benefits?
:: Look for the original source
Because in information warfare, your likes, your retweets and your comments are the prize.
