Tuesday, 30 April 2013

Book of the month: Sitting in the Hot Seat by Rhona Flin

Before I begin this review, I have to confess something: Dr Flin is one of the greats in human factors research, and my writing a review of one of her books may seem hubristic. I therefore appeal to you (kind reader) to take all comments in the spirit in which they are meant: as a subjective appraisal of an expert's work by a neophyte.

Flin's book focuses on the individual (pilot, police officer, manager) who leads the on-scene response to an emergency. She provides an overview of how the different services train and assess these incident commanders, and she makes suggestions on how to improve their preparedness. The book was published in 1996, almost twenty years ago; although aviation had by then been making strides with Cockpit (later Crew) Resource Management (CRM), the lessons from the flight deck had only been taken up by other professions in a haphazard manner.

Flin's first chapter starts with a quote from the Cullen report into the Piper Alpha disaster, in which 167 people died:
"The failure of the OIMs [offshore installation managers] to cope with the problems they faced on the night of the disaster clearly demonstrates that conventional selection and training of OIMs is no guarantee of ability to cope if the man himself is not able in the end to take critical decisions and lead those under his command in a time of extreme stress."
It is possible that only a few men died in the initial explosion; the majority of the remaining crew followed standard procedures and gathered in the accommodation block to await further instructions from the OIM, instructions that never came.

Later in Chapter 1, Flin provides further quotes from the Cullen report, which give us insight into the actions of the OIM:
"The OIM had been gone 'a matter of seconds when he came running back' in what appeared... to be a state of panic..."
"One survivor said that at one stage people were shouting at the OIM and asking what was going on and what procedure to follow. He did not know whether the OIM was in shock or not but he did not seem able to come up with any answer."
The Cullen report led to a focus on safety management systems and on the process of OIM training and selection. The report called for regular training of OIMs in emergency exercises, which would allow them to practise decision-making in a stressful environment.

The parallels with my own job as an anaesthetic consultant are clear: 99% of my work is straightforward and routine; 1% is crisis management requiring rapid action to prevent patient harm or death. In 1990 Cullen realised that the only way to make sure that OIMs were prepared for the 1% was to practise and simulate. Although we have made a start in healthcare, and in many respects anaesthesia is ahead of the game, we still do not practise for rare events frequently enough.

In the remainder of Chapter 1, Flin supplies definitions and provides an overview of the incident command and control procedures in the emergency services, hazardous industries (nuclear, chemical, etc.) and the armed forces. She then goes on to look at the lack of training that contributed to other major disasters, such as the Scandinavian Star fire, the Bradford City stadium fire, Heysel and Hillsborough.

Fig 1: Flin's model of command team performance
Chapter 2 explores the selection of incident commanders, and Chapter 3 describes (in some detail) the training of incident commanders in different workplaces. Chapter 4 looks at the stress of incident command, which I discuss a bit later in this post. Chapter 5 explores command decision making, which deserves a whole post to itself (stay tuned). Chapter 6 looks at incident command teams (including high-performance "dream teams") and has an excellent diagram of a model of command team performance (Fig 1). Flin mentions the need for a shared mental model, a term I often use in debriefing, but importantly contrasts this with "groupthink", where a team clings to the wrong mental model. Flin also discusses whether everybody needs to know "the big picture", and this is certainly of interest in the operating theatre: as people arrive intermittently to help out with a critical incident, who should update them, and does every new team member need to know the whole story? Chapter 7 is entitled "Conclusions and future directions". Here Flin concludes that the best leaders can diagnose a situation, have a range of leadership styles that they can adopt, and match the correct style to the situation.


Fig 2: diagram of an emergency control room
One criticism I have of the book is the perhaps unnecessary complexity of some of the diagrams (Fig 2 shows us, amongst other things, the location of the VGA-to-composite video encoder in a control room) and the over-description of some of the training installations, for example: "This has a central control room, a tactical and action room, which can accommodate 24 observers as well as the rescue leader, and three smaller team rooms." (p.78) Flin also provides a lot of information on individual courses which, although interesting, can at times weigh the text down.


I think the main lesson I derived from Flin's book concerns the role of stress on incident commanders. In the Piper Alpha disaster mentioned at the beginning of this post, the OIM is in a state of panic; people are shouting at him and he is not responding. The Hillsborough chief superintendent "froze" (p.30). One of the two vital attributes of a leader, according to the World War II Field Marshal Montgomery, is "calmness in crisis" (p.40). Flin refers to a competence assessment for OIMs (p.55) which includes an ability to "Deal with stress in self and others". The best chapter in the book is devoted to "The Stress of Incident Command". However, when I look around today at the various rating tools and marking systems, there is neither a mention of "coping with stress" nor an approach to exploring behaviour under stress during simulation or in "real life". It may be that, as with "communication", the ability to cope with stress is thought to underlie all the other behaviours that we do assess and talk about (e.g. planning, situational awareness, leadership). Having been involved in a few critical incidents, I can easily recall the effect of stress on some of the individuals in the team, to the extent that it was the main driver of the loss of communication and prioritisation. I would therefore like to see more of a focus on stress in simulation, and to see simulation faculty explore with candidates ways of dealing with stress. Flin mentions taking a deep breath as one example.

To steal another quote that Flin attributes to Montgomery: "One great problem in peace is to select as leaders men whose brains will remain clear when intensely frightened; the yardstick of 'fear' is absent." It is here that simulation can make a difference: the stressful nature of high-fidelity simulation allows us to assess our candidates' responses and behaviour. Much more importantly, it allows us to coach our candidates and promote self-reflection so that they might improve those same responses and behaviours when disaster threatens in the "real world". Flin provides a quote from Charlton (1992):
"Knowledge of the effects of stress enables the individual to take positive steps to avoid the stressors or to reduce them to limit their impact, thereby defusing a potentially dangerous situation."

However, as I mentioned above, simulation is still under-utilised within healthcare. Let's change that.

Thursday, 18 April 2013

Experts and their unknown knowns

On 12 February 2002, Donald Rumsfeld uttered his most memorable words as US Secretary of Defense:
"Reports that say that something hasn't happened are always interesting to me, because, as we know, there are known knowns; there are things we know that we know. There are known unknowns. That is to say, there are things that we now know we don't know. But there are also unknown unknowns. There are things we do not know we don't know" (1)
Wikipedia has a nice article on this statement, including the missing pair: unknown knowns. Rumsfeld does not mention it, perhaps because there is some confusion about what an "unknown known" is. The Wikipedia article calls it "the most dangerous type of unknown" but then (at the time of writing, and with the understanding that Wikipedia is constantly edited) goes on to make a bit of a hash of explaining it. There are suggestions that an "unknown known" is the claim that weapons of mass destruction existed in Iraq, or the Abu Ghraib scandal, or things we refuse to acknowledge that we know.

Personally, I prefer to think of "unknown knowns" as things we didn't know we knew. This is what Dörner, in his book The Logic of Failure, is talking about when he says he knew a doctor who could diagnose a disease with great certainty but could not explain how he did it. Dörner describes this as a type of intuition and goes on to comment that experts often display it. This fits very nicely with a 4-stage framework (first developed by Noel Burch) for looking at expertise:
  1. Unconsciously incompetent: unknown unknown (The total beginner, who has no idea what he/she doesn't know and is oblivious to the breadth and depth of possible knowledge. This is a dangerous place to be, and such a person is a scary one to have looking after you in an emergency.)
  2. Consciously incompetent: known unknown (The novice, who has become aware of how little he/she knows. A somewhat less dangerous place to be: instead of having a scary person looking after you, you now have a scared person looking after you.)
  3. Consciously competent: known known (The journeyman, who knows a great deal but has to spend a lot of time thinking about what he/she is going to do.)
  4. Unconsciously competent: unknown known (The expert, who has a sixth sense for things which are about to go wrong, or who can stop a situation from escalating without being able to tell you how he/she knew what to do.)
In many ways, then, the extremes of the framework are the most untroubled places to be: at one end ignorance is bliss, and at the other end the ignorance is due to the achievement of expertise.

This 4-stage competence framework allows us to see how a learner may progress, and it seems to imply that moving through the 4 stages is beneficial and without drawbacks. However, later in his book Dörner explains how more information (the loss of the unknown) may be detrimental:
"Anyone who has a lot of information, thinks a lot, and by thinking increases his understanding of a situation will have not less but more trouble coming to a clear decision... We realize how much we still don't know, and we feel a strong desire to learn more. And so we gather more information only to become more acutely aware of how little we know..."(p. 99)

In terms of how the framework relates to simulation-based education, I would like to think that it can help us understand which stage a participant has reached. This knowledge should allow the course faculty to tailor the course to the participant. I would also like to think that simulation lends itself well to the little-known 5th stage of competence:
  5. Consciously aware of unconscious competence: known unknown known (An ability to reflect on and examine the behaviours and actions one is carrying out as an expert.)
The Elaine Bromiley case involved a number of experts in anaesthesia and ENT surgery who failed to do the right thing. Partly due to an underdeveloped final stage of competence, these experts were not able to reflect in situ and realise what the correct sequence of actions should have been. Simulation with debriefing allows experts to watch how they make mistakes and to develop mechanisms for preventing them. Unfortunately, it is experts (consultants) whom we see least often in the simulation suite as participants, and this needs to change.