For many years, communicators (including me) have told companies to focus on communicating only high-quality information during a crisis. A crisis is no time for niceties or corporate messaging. No one cares that safety is your top priority if 10 people have been injured in an accident that’s your fault. In the most serious crises and disasters, getting the right information to the right people quickly can be the difference between life and death.
But in some crises, our communications should be more nuanced than that – particularly when we’re dealing with deeply held beliefs, often fuelled by dis- and misinformation.
We saw this during the pandemic, when people were setting fire to 5G masts in the belief that they helped spread Covid. No amount of scientific debunking in mainstream media or on social media could convince them it wasn’t true.
Sharing scientific, verified information about how Covid spread wasn’t enough to counter belief fuelled by disinformation.
Why is that?
The Alan Turing Institute has a fascinating blog post by Trisevgeni Papakonstantinou, a cognitive scientist who has studied belief systems. She says our beliefs are part of a larger system of ideas and beliefs that are interconnected and interact with one another. They don’t exist on their own, and they can reinforce each other in a way that ‘makes them resilient to contradictory evidence.’
She gives examples: people who are sceptical about vaccine effectiveness are also more likely to believe parental intuition is superior to medical expertise, and people who don’t believe that humans are impacting the climate are more likely to trust alternative medicine.
So, if you introduce new information to counter one belief, you’re not just dealing with that belief on its own. You’re dealing with a whole network of beliefs that are tied up with our own identity and self-image. That means that when we are given facts to counter a belief, we might be completely resistant to that new information – it could undermine our whole identity.
Not only might we resist the information, it could entrench us further in those beliefs. The new information threatens to undermine who we are, so we use a defence mechanism to explain it away, and to avoid the cognitive dissonance that comes from holding conflicting information in our brains. We decide the information must be fake news, or part of a wider conspiracy to hide the truth. We create new beliefs to strengthen the old ones and reject the new information.
That’s really hard for crisis communicators to deal with. But in the book Poles Apart by Alison Goldsworthy, Laura Osborne and Alexandra Chesterfield, the authors talk about the role of human connection and stories in gradually changing beliefs and countering polarisation.
If we turn an argumentative position backed by data (the ‘you’re wrong, I’m right’ approach) into a more human-led story that can create a connection between different groups of people, we activate empathy in our brains. We’re more likely to listen with an open mind. We can hold new information within our existing value framework.
Perhaps the answer lies in doing what communications does best – finding the things that connect us: emotion, shared experiences, human stories – rather than purely presenting information that can drive us further apart. Even in a crisis.
–
Photo by Tom Barrett on Unsplash