Tuesday 30 April 2013

Book of the month: Sitting in the Hot Seat by Rhona Flin

Before I begin this review, I have to confess something: Dr Flin is one of the greats of human factors research, and my writing a review of one of her books may seem hubristic. I therefore appeal to you (kind reader) to take any comments in the spirit in which they are meant: as a subjective appraisal of an expert's work by a neophyte.

Flin's book focuses on the individual (pilot, police officer, manager) who leads the on-scene response to an emergency. She provides an overview of how different services train and assess these commanders, and makes suggestions on how to improve their preparedness. The book was published in 1996, almost twenty years ago; although aviation had been making strides with Cockpit (later Crew) Resource Management (CRM), the lessons from the flight deck had only been taken up by other professions in a haphazard manner.

Flin's first chapter starts with a quote from the Cullen report into the Piper Alpha disaster in which 167 people died:
"The failure of the OIMs [offshore installation managers] to cope with the problems they faced on the night of the disaster clearly demonstrates that conventional selection and training of OIMs is no guarantee of ability to cope if the man himself is not able in the end to take critical decisions and lead those under his command in a time of extreme stress."
It is possible that only a few men died in the initial explosion; the majority of the remaining crew followed standard procedures and gathered in the accommodation block to await further instructions from the OIM. Those instructions never came, and it was there that most of them died.

Later on in chapter one, Flin provides further quotes from the Cullen report which give us insight into the actions of the OIM:
"The OIM had been gone 'a matter of seconds when he came running back' in what appeared... to be a state of panic..."
"One survivor said that at one stage people were shouting at the OIM and asking what was going on and what procedure to follow. He did not know whether the OIM was in shock or not but he did not seem able to come up with any answer."
The Cullen report led to a focus on safety management systems and on the process of OIM training and selection. The report called for regular training of OIMs in emergency exercises which would allow them to practice decision-making in a stressful environment.

The parallels with my own job as an anaesthetic consultant are clear: 99% of my work is straightforward and routine; 1% is crisis management requiring rapid action to prevent patient harm or death. In 1990 Cullen realised that the only way to make sure that OIMs were prepared for the 1% was to practice and simulate. Although we have made a start in healthcare, and in many respects anaesthesia is ahead of the game, we still do not practice for rare events frequently enough.

In the remainder of chapter one, Flin supplies definitions and provides an overview of the incident command and control procedures in the emergency services, hazardous industries (nuclear, chemical, etc.) and the armed forces. She then goes on to look at the lack of training which contributed to other major disasters, such as the Scandinavian Star fire, the Bradford City stadium fire, and the Heysel and Hillsborough stadium disasters.

Fig 1: Flin's model of command team performance
Chapter 2 explores the selection of incident commanders, and Chapter 3 describes (in some detail) the training of incident commanders in different workplaces. Chapter 4 looks at the stress of incident command, which I discuss a bit later in this post. Chapter 5 explores command decision making, which deserves a whole post of its own (stay tuned). Chapter 6 looks at incident command teams (including high-performance "dream teams") and has an excellent diagram of a model of command team performance (Fig 1). Flin mentions the need for a shared mental model, a term I often use in debriefing, but importantly contrasts this with "groupthink", where a team clings to the wrong mental model. Flin also discusses whether everybody needs to know "the big picture", and this is certainly of interest in the operating theatre: as people arrive intermittently to help out with a critical incident, who should update them, and does every new team member need to know the whole story? Chapter 7 is entitled "Conclusions and future directions". Here Flin concludes that the best leaders can diagnose a situation, have a range of leadership styles that they can adopt, and can then match the correct style to the situation.


Fig 2: diagram of a training installation's control room
One criticism I have of the book is the perhaps unnecessary complexity of some of the diagrams (Fig 2 shows us, amongst other things, the location of the VGA to composite video encoder in a control room) and the over-description of some of the training installations, such as: "This has a central control room, a tactical and action room, which can accommodate 24 observers as well as the rescue leader, and three smaller team rooms." (p.78) Flin also provides a lot of information on individual courses which, although interesting, can at times become overly descriptive.


I think the main lesson I derived from Flin's book concerns the effect of stress on incident commanders. In the Piper Alpha explosion mentioned at the beginning of this post, the OIM is in a state of panic; people are shouting at him and he is not responding. The Hillsborough chief superintendent "froze" (p.30). One of the two vital attributes of a leader, according to the World War II Field Marshal Montgomery, is "calmness in crisis" (p.40). Flin refers to a competence assessment for OIMs (p.55) which includes an ability to "Deal with stress in self and others". The best chapter in the book is devoted to "The Stress of Incident Command". However, when I look around today at the various rating tools and marking systems, there is neither a mention of "coping with stress" nor an approach to exploring stress under pressure during simulation or in "real life". It may be that, as with "communication", the ability to cope with stress is thought to underlie all the other behaviours that we do assess and talk about (e.g. planning, situational awareness, leadership). Having been involved in a few critical incidents, I can easily recall the effect of stress on some of the individuals in the team, to the extent that it was the main driver of the loss of communication and prioritisation. I would therefore like to see more of a focus on stress in simulation, and for simulation faculty to explore with candidates ways of dealing with stress. Flin mentions taking a deep breath as one example.

To steal another quote that Flin attributes to Montgomery: "One great problem in peace is to select as leaders men whose brains will remain clear when intensely frightened; the yardstick of 'fear' is absent." It is here that simulation can make a difference: the stressful nature of high-fidelity simulation allows us to assess our candidates' responses and behaviour. More importantly, it allows us to coach our candidates and promote self-reflection so that they might improve those same responses and behaviours when disaster threatens in the "real world". Flin provides a quote from Charlton (1992):
"Knowledge of the effects of stress enables the individual to take positive steps to avoid the stressors or to reduce them to limit their impact, thereby defusing a potentially dangerous situation."

However, as I mentioned above, simulation is still under-utilised within healthcare. Let's change that.
