In a recent simulation exercise, one of the team members said: “We’ve been through something like this before. We know how to handle it. It’s business as usual.” You could feel the room relax. They were treading a well-worn path and had the experience and the tools to handle it.
An hour or so later, they saw they’d missed a couple of clues that this crisis was, in fact, different. Because it looked familiar at the start, they had made assumptions about the outcome. But small things – the timing of the crisis, a subtle shift in public conversation, an attack from an unexpected source – meant it mushroomed quickly and had the potential to be extremely serious.
They were a fantastic team and adapted quickly. But it took time to recover from that initial miscalculation.
During the pandemic, I wrote about people’s ability to believe they will survive anything based on their past experience of surviving a crisis (in a blog post, ‘Why do people flout lockdown guidelines?’). During the Second World War, the Canadian psychiatrist J.T. MacCurdy researched the impact of fear on morale. He found that when bomb sirens became commonplace, people were no longer afraid. They got bored of the warnings and stopped going into shelters. They started to believe they wouldn’t be hit. They’d survived attacks before, and they’d survive them again.
The emergency planner Lucy Easthope addresses this in her brilliant new book ‘Come What May’. During Hurricane Katrina in 2005, she says, many older residents believed they would survive the hurricane based on their experience of surviving previous weather events, in particular Hurricane Betsy in 1965. She talks about the power of survivable near-misses that lead to what she calls a ‘fatal bias’: a sense that ‘we’ve got this.’
We see this ‘fatal bias’ so often in teams going through a crisis simulation exercise. They frame the crisis as something they know and understand. They’ll be fine. Their desire for the event to be controllable or survivable overpowers their ability to see what’s different about it, what has the potential to be really damaging.
When a team sees what they want to see, or what their experience tells them they can manage, this bias seeps into their thinking and responses. They stop seeing the detail, because they’ve assumed they know how the crisis will unfold. They don’t see new patterns emerging because it doesn’t suit the narrative that this is something they’ve handled before. Suddenly, they’ve lost control and they have to pivot quickly.
How do we avoid this?
- Treat every crisis as though it’s brand new. Look at the detail with fresh eyes, and get someone who hasn’t been through it before to assess what’s happening.
- Actively look for differences. What is new about this situation that you haven’t seen before? What could the worst case scenario be?
- Listen carefully. What sounds new and different from previous situations? Give your whole team the space to voice their concerns – that tiny voice in your head that says ‘what if…?’ could be the difference between success and failure.
- Role-play. Think about the crisis from different perspectives – it might help you see something new.
Above all, ask yourself: “Am I treating this crisis as the same because I want it to be?”
–
Featured photo by Sorin Gheorghita on Unsplash