The Disinformation Nations
The world’s information wars are ramping up, and we’re not ready to fight.
Sviatlana Tsikhanouskaya almost certainly won the last Belarusian presidential election. The incumbent’s lies kept her out of the job.
Yevhen Fedchenko has been defending against a Russian information invasion for eight years. Moscow says he’s a spy.
Andrew Shearer is the head of Australia’s primary intelligence analysis agency. His country is facing constant break-in attempts from China.
Yasmin Green is the CEO of Google’s in-house think tank, dedicated to fighting extremism and countering disinformation. She’s struggling to even understand the scope of the problem.
Chris Coons is a Senator for the politically even-keeled state of Delaware. Two years ago, he ran against a challenger who believes in QAnon and that everyone who stands with Ukraine is either “transgender, a Satanist, or a straight up Nazi.”
The Halifax International Security Forum brought together these five very different people with five very particular experiences on the frontlines of the information war as part of a panel on The Disinformation Nations — moderated by Evan Solomon. On a very special edition of Bug-eyed and Shameless, we’re going to hear from these experts, and break down some very particular disinformation problems and misinformation solutions from Belarus, Ukraine, Australia, and the United States.
Sviatlana Tsikhanouskaya
A state monopoly on disinformation
Belarusian President Alexander Lukashenko's regime is built on lies. For many, many years — using state TV, state newspapers — they wanted to show people that they are living in a wonderful country, an island of stability, and that Western countries are rotten.
With the help of the internet, with the opportunity to travel, the younger generation got the opportunity to visit other countries, to see how, in reality, other people live. They started to ask themselves questions. First of all, do we really live in a wonderful country?
All this burst into a huge uprising, in 2020, after fraudulent elections. What did he do in that moment? Lukashenko just blocked the internet for a couple of days. The only opportunity for the protesters to communicate, at that moment, was TikTok. So he turned his attention to the alternative media in our country. He declared all the media are extremists and forbade people from watching YouTube and reading alternative media. The media had to flee Belarus; they relocated and started to work from exile.
So now when you are kidnapped by the KGB, and they see that you are subscribed to one channel on Telegram, or to alternative media: Away to prison. You will be detained for many, many years.
Tsikhanouskaya is an unlikely leader. It was her husband, Sergei Tikhanovsky, who became the de facto head of Belarus’ freedom movement.
Tikhanovsky, a successful YouTube vlogger, had eclipsed Belarus’ loyal opposition political parties and its corrupted state media to become the most powerful opposition voice in the former Soviet republic. He was such a threat that he, and some of his high-profile supporters, were arrested. Tikhanovsky was banned from running.
So Tsikhanouskaya stepped up. Riding a wave of discontent with Lukashenko, “Europe’s last dictator,” she launched a massively popular campaign, pulling tens of thousands of people into the streets. According to one independent analysis, she secured 56% of the vote in the first round.
Lukashenko, of course, declared victory. The state said Tsikhanouskaya won just 10% of the vote, against Lukashenko’s 81%. Her husband was tried and convicted of leading a violent insurrection, and she left the country to advocate for a free Belarus from exile.
In Halifax this weekend, Tsikhanouskaya called for help. She also thinks the information war against Minsk and Moscow is one that can be won.
People know how to avoid these restrictions. They’re using VPNs, they are deleting their search history — just in case they are detained. This fight is, to a large extent, an informational war. We have to counter Russian and Belarusian propaganda, because they reiterate the narrative that Ukrainians are our enemies, that Ukrainians are Nazis, about the rotten West. They are looking for external enemies.
So this is why I’m asking all the democratic countries to help civil society, to help our media in effectively contradicting the Russian propaganda. We have to have access to more information. We have to develop investigative journalism in Belarus.
I asked Tsikhanouskaya about her husband. The conditions in the prison are brutal, she says. A lawyer visits once a week “just to check if he’s alive.” But, she says, the political prisoners in Belarus’ jails are strong.
“They believe in us. They believe in a democratic society.”
Yevhen Fedchenko
Total information war
In order to understand the future of the war in Ukraine, we need to understand the information component of it, to analyze it. We’ve been working on it on a daily basis since 2014 — looking into those black depths of Russian disinformation. As you might imagine, this zone is not the most pleasant place to be, but someone needs to do it. We collected a huge data set as we were approaching February 24. We saw the increasing volumes of disinformation, and analyzed the main narratives which Russia was using to justify this invasion of Ukraine.
Looking at this data set we collected, we could also look at how these narratives change. This is very important, because it explains what Russia is going to do next: Militarily, politically, diplomatically. At the beginning, they were preparing justification for this war. Then they realized nobody believes it anymore. So they came to the next stage: they started to sow doubts about Ukraine — like Ukraine is selling the weapons which are supplied by allies; Ukraine is a corrupt state, so why should you give any money to Ukraine; etc. So they’re in the business of sowing doubts. And as you might imagine, there’s an audience for that. And now they’re going to the next stage, blaming Ukraine for food prices, energy prices, and any other sins which are happening now in the world.
This is a part of the war efforts which Russia was building for years. And the biggest problem is that this infrastructure is still intact. It did not disappear. They are using the same platforms. They're using the same networks and influencers, and it will take quite a lot of time to stop this machine.
Fedchenko’s claim to have foreseen this problem is pretty well-founded. In 2014, shortly after Russia’s annexation of Crimea, Fedchenko and some colleagues founded StopFake, a shockingly agile and effective outlet that doesn’t just debunk Russian propaganda: It pre-empts, contextualizes, and explains it.
Organizations like StopFake, credible media outlets like the Kyiv Independent, and reporters who are willing to report on the frontlines of this war are creating a, frankly, remarkable resiliency against Russia’s information war.
I’ve relied on StopFake’s work recently while reporting on Moscow’s propaganda campaign on nuclear weapons, and they were instrumental in my reporting about Russia’s batshit biolabs conspiracy theory.
Russia has leaned into the effort to demonize and vilify Ukraine: Painting the war as a justifiable quest for security against a nefarious neighbour; colouring victory as inevitable; sketching NATO involvement as the real aggression.
Figuring out how to short-circuit this campaign is going to take more time. The easy short-term win is to keep doing exactly what drives Moscow mad.
At the Forum, Republican Senator Jim Risch underscored that even in the acrimony of Congress, bipartisan support for Ukraine has withstood this full-scale psychological warfare. “There are only a handful of people that are balking at engaging in this struggle in Ukraine,” Risch told reporters. “They’re getting a whole lot of heat from you guys. But there’s obviously 535 [members of Congress and Senators] — they probably make up half a dozen or so. So focus on the majority.”
An emerging problem isn’t that Russian propaganda is working on Americans, Canadians, Europeans — it’s that misinformation here is being weaponized there. As I’ve written before, there is a pipeline from QAnon to Tucker Carlson to Russian Telegram channels to Russian state TV. Russia’s main problem has always been tailoring its message for English-speaking audiences: We’re solving that problem for them.
And, increasingly, Russia is figuring out that its most receptive audiences aren’t in New York, Toronto, or London. They’re in Mexico City, La Paz, Kampala, Bangkok, and broadly in the Global South. Even if we’ve limited the reach of Russian propaganda here, we have done nothing to limit it in the Global South.
StopFake is fighting that fight: It is now available in 13 languages.
Andrew Shearer
The security state responds…kind of.
As an intelligence professional, I’d say: We shouldn’t be too surprised. These disinformation techniques have been around as long as there’s been statecraft. In the Cold War, it was Soviet active measures. The labeling sometimes shifts, but there’s a deep desire to reach into our societies to shape our narratives, to steal our secrets, to sow seeds of domestic discontent, and so forth. It is a very long story. When we look at what we’re up against, it’s incredibly daunting.
Perhaps because I’m historically minded, I take some comfort from the challenges we’ve overcome in the past. We did rise to that challenge during the Cold War. During the ISIL explosion, we responded effectively — we developed and adapted our toolkit, we formed new partnerships, outside government and across government.
I can’t really be here and not give a shout out to my American counterparts, and the entire US intelligence community, for the masterclass that they put on, leading up to, and following, the Russian invasion. American intelligence completely reshaped the international narrative, and gave the West the strategic initiative in an incredibly powerful way, against a guy who made his whole life about these techniques that we’re talking about here. So, you know, there are some successes there.
What's changed? Technology. Bots, troll farms, etc. With the speed and scale, that’s something we're all grappling with.
What’s the intelligence component? Our traditional function — strategic warning, actionable intelligence, collecting information that enables us to verify what’s going on and making sense of it — none of that’s changed. But we have to do it in this incredibly dynamic, ultrafast, massive, high-volume information world. I think that’s a real challenge.
From an intelligence point of view, we need the technology and the tradecraft to do open source collection at a massive scale.
I have mixed feelings about the recent trend of spychiefs taking on a more public role.
On one hand, I am a deeply paranoid person. I, unlike some people in the public realm these days, haven’t forgotten the Snowden leaks. I still harbour a deep distrust for our governmental national security agencies — particularly because of how they eviscerated our right to privacy to expand their own surveillance powers, and because of how they enabled the deep mistrust of Muslims in the West.
At the same time, I’m deeply concerned about the rising threat of domestic extremism and foreign interference. On that challenge, we need our intelligence and security services to do more, not less.
I was feeling that tension seven years ago, when I profiled Admiral Mike Rogers, then the affable head of the National Security Agency, and his “charm offensive.” (Also a Halifax Security Forum alumnus.)
So I listened to Shearer with mixed emotions. Yes, I agree, the Five Eyes need to be clear and direct about the threat of domestic radicalization and foreign information operations. But when I hear the phrase “open source collection at a massive scale,” the hair on the back of my neck goes up. We need security services that are able to monitor and track extremist narratives online, and respond before they emerge into real-world violence. But that apparatus can’t then be used to make criminals and terrorists out of legitimate democratic actors, be they on the left or right; Indigenous, environmentalist, separatist, or libertarian.
We’ll figure out that problem by having this debate out in the open, instead of setting the rules behind closed doors.
During the panel, I asked why we have recently become so allergic to using the word “deradicalization.” Whether it’s someone indoctrinated by Russia, or someone who has been convinced that Jews are an international cabal dead set on destroying the white race — how do we bring them back to reality, in a non-coercive manner? Do we have to wait until those individuals leave to fight with the Wagner Group, or attack a synagogue? While Shearer may have good things to say about our experience in combatting the Islamic State’s radicalization machine, I’m not quite as complimentary.
Yasmin Green
Big tech surveys the field
We have a kind of ongoing panel of people who are conspiracy-minded and consumers of disinformation. What we’re seeing is that there is now this intersectionality of conspiracy belief, disinformation purchase — that intersects health misinformation, that intersects election denial in the U.S., that spans ‘5G is a threat’ — and there’s a whole industry around mobilizing these people. There are conferences, like ‘The Great ReAwakening.’ They tour and they convene people: Religious leaders, discredited medical professionals, political pundits. And they’re selling merch: T-shirts, banners, metal coins to protect you from 5G, medical cures.
There is something that has taken a hold of us.
I remember when we did our first study of conspiracy theorists, I said: ‘I’m really interested in having some people who believe in health disinfo, some people who believe in false flags, some people who believe in the great replacement.’ So I gave the mandate and then we got 100 people together. There wasn’t a single person who only believed in one conspiracy theory.
In the case of the midterms, I gotta say, the election-denier domain really felt the efficacy of effective moderation. Not to say that it’s flawless, but they were really frustrated. Their posts about the election being stolen or about election irregularities were being taken down, and they were so frustrated with the media platforms that they had to go elsewhere. And I think that’s something that we must increasingly consider: The platforms that are private, so Telegram or private messaging apps, and the platforms that are hyper-partisan. Because those that are popular in the U.S. — Parler and Rumble — don’t even have misinformation policies.
Those guys have it as their marketing sticker that they don’t have any policies against misinformation. So I think the challenges are getting more complex.
Green is the CEO of Jigsaw, Google’s moonshot attempt to make the internet more of an unqualified source of good.
It has been, in short, a mess.
Google, Facebook, Twitter, TikTok: All of these companies bear a tremendous amount of blame for leading us into our current radicalization and disinformation problem. Their algorithms juiced overheated rhetoric and privileged extremist voices. They viewed this problem as secondary to short-term profit generation, and implemented ham-fisted moderation as an afterthought.
But Green is exactly right, here. The collapsing of all conspiracy theories into a single political ideology, paranoid populism, is a terrifying development. (I’ve been writing about this for the past [long sigh] two years.) I’ve been grappling with what to do about this. Green is right: Moderation has worked. But for whom?
Yes, you’re much less likely to encounter election denialism and anti-vaccine nonsense on your Facebook or YouTube feed or in your Google search results. But the disinformation has just moved elsewhere: To Parler, Gab, Rumble, BitChute, DuckDuckGo, 4chan, Truth Social, Telegram, Odysee, Locals, etc.
True, moving this disinformation from a platform with 80% uptake to one with just 10% theoretically limits the pool of people who can be radicalized. But it also intensifies the radicalization for those who opt for these platforms. And we still don’t know what it will look like if, say, Rumble begins to rival YouTube, or if Twitter reverts into something like Gab. The temperature on these alt-social media platforms has risen to such a degree that it could scorch everything in its path.
Chris Coons
A House divided against itself cannot drain the swamp
The citizens of countries like Iran and North Korea, Belarus or Russia have great difficulty accessing reliable, independent information, and the regimes there maintain their control of the populations by shutting off access to the rest of the world. We fund and support — and many in the private sector have helped develop — cutting-edge tools to ensure access to reliable information for those who are living under regimes that disconnect them from the rest of the world.
There are also huge challenges in open societies: There have been intentional efforts to interfere, through disinformation and misinformation, in Canada and Australia and the United States. Obviously, it’s had a significant impact on our response to COVID. I would argue one of our great challenges legislatively, in the United States Congress, is both understanding and coming to grips with the impact of social media: On our body politic, on our sense of ourselves, and on our ability to focus.
There are platforms — I’ll call out TikTok right off the top — that are malevolent, that are both siphoning off huge amounts of data, being used as a tool of state power, and are significantly distracting. But getting my kids to stop looking at TikTok every day, every minute, is enormously challenging. It’s engaging, I would even say addicting.
I’ve got one legislative proposal, the Platform Accountability and Transparency Act, a bipartisan bill that would — with protections — subject social media algorithms to academic study and analysis, so we know whether or not Instagram really is demonstrably harmful to young people. So we know whether or not state actors are able to use some of these tools to accelerate radicalization or to shape our political space.
Congress has not yet done the work that the EU has, with its Digital Services Act, and that other democracies have. The United States needs to show leadership on digital privacy, on protecting the rights of individuals to know and own what companies are taking from them. And we need to better resource our work in countering disinformation.
Putin is increasing his investment in RT and Sputnik and others. The Chinese are increasing their investment in their platforms. We're not doing a particularly great job of it in the United States. I don't really ever expect the State Department — that model of crisp, concise, edgy, funny communication — to be at the cutting edge of our engagement in the world. Our culture is one of our greatest sources of soft power. We could be more successful at countering disinformation.
Senator Coons’ comments are rather refreshing. It’s a clear-eyed look at the scope of the problem, with solutions that actually seem feasible.
His version of “algorithmic transparency,” for example, involves giving independent researchers targeted and structured authority to test social media algorithms and not, as some have unrealistically called for, to publish all those algorithms publicly. (It’s a nice idea. It won’t happen.)
He also recognizes that Western powers effectuate change by blowing holes in authoritarian censorship regimes and by providing arm’s-length funding for good journalists on the ground, and not by dispatching the State Department to be the singular voice of truth abroad.
He correctly diagnoses TikTok, as well, as both a source for data capture and an opiate of the masses. While we shouldn’t go too far in declaring that Beijing is trying to brainwash us with fun dance videos, there is an interesting conversation to be had about the soft power of information in contrast to the hard power of disinformation. That is: Deluging the West with addictive and fluffy content, interspersed with conspiracy theories, could have a more direct impact than, say, heavy-handed Russian disinformation. Beijing is, without question, getting better at figuring out how to get the masses hooked on its content.
It’s great to hear a sitting senator talk clearly about these issues. But we’re still waiting on action.
I’ve opened comments to everyone, because I’m curious to hear your thoughts on the various perspectives above.
Expect another Halifax dispatch tomorrow!
The above transcripts were edited for clarity and brevity. While the editing was liberal, I worked hard to preserve the intent of each speaker.