Monday 23 November 2015

Soaring with eagles or swimming with sharks? Aviation, banking and healthcare

"So, I should sell my shares in solar?"
When Steven Shorrock (@StevenShorrock), Safety Development Project Leader at EUROCONTROL, spoke at ASPiH 2015, he mused aloud that the conference organisers might have done better to invite a speaker from the banking sector than from air traffic control. He recommended reading "Swimming with sharks" for an insight into the former, and these are my thoughts after reading it.

The book, "Swimming with sharks", published in 2015, is based on interviews with about 200 people who worked or recently worked in banking in London. The author, Joris Luyendijk, discusses the financial crash of 2008, which almost brought the developed world to its knees, as well as other scandals such as Libor trading, laundering of drugs money, assisting tax evasion and the fall of Lehman Brothers. The real substance of the book lies in the personal accounts of the people working in banking and their thoughts about the sector. There are definite similarities between healthcare and banking but also some important differences... 

Similarities

Safety
In banking, compliance officers are meant to keep an eye on the traders and ensure that trades do not expose the bank to too much risk. However, as Luyendijk states: "...it is very difficult to put a number on a loss you have averted by saying no..." A football analogy may be apt: the goalkeeper is remembered less for the many saves than for the single goal which lost her team the game. In healthcare a similar effect may be seen with those who work in "patient safety", including simulation-based training. Because safety is a "dynamic non-event", interventions aimed at improving safety may be seen by both front-line staff and hospital board members as costly and time-consuming with little return on investment.

Culture and sub-cultures
Although there is much talk in the NHS of changing "the culture", Shorrock suggested that healthcare may be more similar to banking in this respect too, with a plethora of "sub-cultures" rather than one over-arching culture. "Swimming with sharks" details a few of these sub-cultures: the dog-eat-dog world of the front-office traders, for example, compared with the investment bankers in asset management, who are encouraged to make slow, deliberate moves. Luyendijk says that banks like to portray themselves as organisations run like an army or an airport, but in fact they are "clusters of islands in the fog, staffed by mercenaries" (p.145). At their best, the culture(s) of hospitals can approximate those of aviation: although distinct in terms of values and behaviours, the over-arching goal is the same. When hospitals fail, their cultures resemble those of banks, each sub-culture putting its own interests first, resulting in a disconnect between frontline staff and the Board.

"Culture" is also high on the investigation list of a Holmes-like consultant who helps banks prevent and detect rogue trading (p.145). Luyendijk quotes him: "Is this a place where somebody can raise his hand and say I made a mistake?" In healthcare too the culture of a specific department or ward will give you an insight into how safe it is for patients. Dysfunctional teams are unlikely to be safe teams.

Lastly, one of the interviewees states: "It's not the people who are bad: it's the culture..." The same may be said of healthcare. Individuals are expected to adapt to, adopt and accept the given culture within a unit, department or ward. Luyendijk, in his recommendations at the end of the book, says: "One thing I believe does not help is to reduce the problems with global finance down to individual character flaws... if you blame all the scandals as well as the crash on individuals you imply that the system itself is fine, all we need to do is to smoke out the crooks..." This "bad apple" theory is well addressed by Sidney Dekker and others. In fact, the system/culture encourages certain types of behaviour, and without changing the fundamental causes we are unlikely to see long-lasting change.

Hierarchies and silos
Luyendijk refers to Gillian Tett, a Financial Times journalist, who argues that "the number-one problem in investment and megabanks is that everyone works in 'silos'". A compliance officer Luyendijk interviewed says: "We need to get rid of the idea of "the bank". That term implies a unity of action and purpose, as if there's an all-encompassing view driving the bank. There is no such thing. What we have is a collection of individuals in positions of power. Each of them manages his or her own world." The same personalities can be found in healthcare, with the clinical lead or general manager who looks after "their world" without reference to the needs of the wider organisation.

Complexity
Some financial products have become so complex that very few people, even inside the bank, actually understand what they represent or how risky they are. "Even the risk and compliance people who were supposed to be our internal checks and balances... We had to teach them how to monitor us" (p.132). One of the causes of the crash of 2008 was that the complex financial products on offer, collateralised debt obligations (CDOs), although AAA-rated by ratings agencies, were filled with mortgages which people were never going to be able to repay. A similar level of complexity operates in healthcare. A Board which is not proactive in understanding the everyday workings of the hospital risks thinking the place is much safer than it actually is. In addition, decisions made by the Board (with the best of intentions) may have significant, unexpected implications on the shop floor.

Lack of funding in IT
Handwriting: Lantus 8 Units misread as 80 Units
"Your readers would be shocked if they realised just how crap the IT organisation is in many banks..." (p.141). The same is true in healthcare. Whether it is "the biggest IT failure ever seen" costing £10 billion of UK taxpayers' money or the fact that most hospital drug charts are still hand-written, IT investment in healthcare is lacking. This means that, as in banking, many systems don't "speak" to each other, multiple passwords are required for multiple systems and patients' notes are still folders where pages may go missing. The potential for errors is phenomenal.

Short-termism
In part because of the risk of instant dismissal (see below, under Differences), bankers often have a short-term outlook (p.154). In the NHS, acceptance of a post is often "for life", particularly at the patient-facing end. However, there are perhaps similarities with NHS board members, whose turnover is much more rapid. This means that the long-term effects of some decisions never become obvious to board members who have moved on.

Speaking up
A number of interviewees spoke of the futility of speaking up when witnessing poor practice: "I'd lose my job never to find a new one anywhere in the City. Meanwhile nothing would have changed" (p.188). The same can be said of healthcare: the fate of Stephen Bolsin, who spoke up about the Bristol heart surgery deaths, is not unique.

Differences

Focus
The focus of bankers and banks is to make money. Bankers want to make money for themselves and, in a meritocratic system, are rewarded for how much money they make for the bank. Public healthcare, as generally found in the NHS (although the English NHS seems to be on the road to privatisation), is instead focused more on not losing money than on making a profit. This means that the majority of healthcare workers do not have a vested interest in increasing throughput or reducing costs.
Dismissal
In banking, dismissals are unexpected and immediate. To prevent the newly unemployed trader from damaging the bank, all network access is revoked and they are escorted from the building. In healthcare it is much more difficult to dismiss employees, even when their behaviour seems obviously unacceptable. Although at first glance it may seem preferable to be able to instantly dismiss "bad apples" or underperformers, the effect this has on banking is not negligible. Bank employees may have much less loyalty to their bank and may therefore care little what effect their risk-taking has on "their" bank (e.g. Nick Leeson and Barings Bank).

Professionalism
The "professional" doctor vs the "professional" banker
In the City, the biggest compliment is 'professional'. "It means you do not let emotions get in the way of the work, let alone moral beliefs" (p.107). A recurring theme in Swimming with Sharks is that bankers are not immoral, but rather amoral. As long as an action is legal it is irrelevant whether it is "right" or "wrong". In healthcare the word "professional" may still mean a number of different things to different people, but what it most certainly does not mean is "amoral".


Final thoughts

Was Steven Shorrock right? Is healthcare more like banking than aviation? In effect it can be like both. At its best, healthcare resembles aviation with its focus on safety and a desire to ensure that the passenger/patient gets to his destination/home in one piece. At its worst, healthcare resembles banking, with a climate of fear (p. 95), amorality, back-stabbing and a focus on money and targets (cf. Mid-Staffs). It is up to all of us to decide what kind of ward, department and hospital we want to work in. In particular it is the job of hospital leadership to foster a safety culture and sell the idea of "the hospital" or "the healthcare centre" that staff can be loyal to and work together for. Hospital leaders must also speak truth to power by making it clear that the current cost-cutting across the NHS cannot be making the organisation safer. It is the job of politicians to provide the funding required and to protect the nascent safety culture against accusations of "blunders" as more adverse events are reported.

As for banking, Luyendijk makes a convincing argument that nothing substantial has changed. The basic weaknesses which almost brought the developed world to a standstill in 2008 remain in place. Unfortunately the political willpower to make the required root and branch reform is lacking.



Acknowledgments: 

"Swimming with sharks" does not make a single reference to healthcare or human factors. I am unlikely to have read it without Steven Shorrock's recommendation for which I am grateful.

Wednesday 28 October 2015

Book of the month: Human Factors and Behavioural Safety by Jeremy Stranks

About the author

According to the book's blurb Jeremy Stranks "has 40 years' experience in occupational health and safety enforcement, management, consultancy and training." Stranks is the author of a number of books on health and safety, including "The Handbook of Health and Safety Practice" and "Stress at Work: Management and Prevention".

Who should read this book?

The book will be of use to simulation centre directors and managers. Specific chapters may be interesting for others involved in simulation-based medical education.

In summary

The book consists of 19 chapters:

  1. Human behaviour and safety
  2. Human sensory and perceptual processes
  3. Organizations and groups
  4. People factors
  5. Perception of risk and human error
  6. Organizational control and human reliability
  7. Improving human reliability
  8. Ergonomic principles
  9. Ergonomics and human reliability
  10. Principles of communication
  11. Verbal and nonverbal communication
  12. Written communication
  13. Interpersonal skills
  14. Systematic training
  15. Presentation skills
  16. Health and safety culture
  17. Change and change management
  18. Stress and stress management
  19. The behavioural safety approach

What’s good about this book?

How many criteria does your sim centre/programme meet?
Stranks provides good descriptions of theories which may be unfamiliar to healthcare professionals. Herzberg's two-factor theory of job (dis)satisfaction (p.10) argues that the basic needs of employees, "hygiene factors", need to be met before job satisfaction can be improved through "motivators". For example, an employee is unlikely to be satisfied with having a challenging job if her supervision and working environment are poor.

McGregor's Theory X and Theory Y (p.73) are also explored. Theory X says that people don't like to work and will not work unless coerced. Theory Y says that people will work if they are provided with the right environment in which their inherent motivation will emerge. Stranks also provides an overview of other concepts more familiar to simulation and human factors personnel such as Rasmussen's model of behaviour (p.123), error classification (p.127), the Swiss cheese model (p.130) and others.

Stranks provides a good overview of the elements and implementation of a behavioural safety programme, including significant workforce participation, a data-driven decision process and peer-to-peer monitoring (pp.28-29). He also drives home the need for "clear and evident commitment from the most senior management downwards, which promotes a climate for safety..." (p.93), a need which is evident (and largely unmet) in healthcare.

Hale and Hale (1970)
Stranks describes accident prevention strategies and classifies them according to whether they are proactive or reactive. Proactive strategies include "safe place" and "safe person" (p.43). This concept may also be applied to healthcare. The safe place strategy aims to ensure that the premises, the equipment, the processes, etc. are safe. The safe person strategy addresses behaviour, vulnerable people (e.g. those lacking in experience) and personal hygiene (e.g. hand washing).

A number of chapters are of interest to simulation faculty and those involved in research, including the chapter on risk perception. Simulation faculty may find that Hale and Hale's model of human performance in relation to accident causation (p.112) could provide a structure for a debrief analysis.

What’s bad about this book?


The lack of references makes the book more difficult to read than it needs to be. For example, on page 15 Stranks states: "Most people can only take in and retain 3.1 'bits' of information at any one time." This is probably a reference to Miller's seminal "The magical number seven, plus or minus two: Some limits on our capacity for processing information". However, Miller's paper refers to 3.1 bits (i.e. about 2^3.1 ≈ 8.6 distinguishable alternatives) only for some types of data, such as "hue" and "pitch and loudness". A similar problem occurs on p.26, where Stranks provides a (long-winded) definition of human factors: it is unclear whether it is his own or from elsewhere.

Stranks discusses some concepts (e.g. task fixation, alarm fatigue) without naming them, which makes it more difficult for the novice to link Stranks' writing with prior knowledge. Some concepts are poorly explained (such as fault tree analysis (p.40) and the total working system (p.213)) and occasionally the figures are unclear (e.g. Figure 7.1, p.169). Some concepts are covered superficially but then not linked to anything else (e.g. learning styles, p.174), and the chapters could generally have better introductions to show the logical flow of ideas.

Stranks uses "human factors" in the plural: "What are human factors?" (p.90) and in the singular: "Human factors has an important role..." (p.100). He also uses the term "ergonomics" to mean the scientific discipline. While this may be purely semantic, it would be clearer to define the terms and then stick to those definitions.

Stranks states that "The ultimate objective (for engineers) is to design equipment which requires the least physical and mental effort on the part of the operator" (p.208). One could argue that this is not true. The equipment should probably require just enough mental effort to keep the operator "in the loop" and engaged.

Stranks argues that "The use of posters... repeating a specific message are important features of the safety communication process" (p.275). This is argued against by a number of human factors experts including Terry Fairbanks (see urinal pic).

Lastly, the entire chapter on presentation skills (chapter 15) should be skipped. If presentation skills are a problem then there are much better books out there, such as "Talk Like TED".


Final thoughts

The entire contents of Stranks' book will not be of interest (or use) to the majority of people working in simulation-based medical education. However it may be of use to managers and directors and to people involved in clinical human factors. In addition, some chapters may be of interest to a wider audience and therefore a glance at the chapter headings may be worthwhile. Reading it with a "clinical" mindset, one can appreciate that the progression in safety management systems, the changes in culture required, and the elements and implementation of a behavioural safety programme are, with minor modifications, relevant to the healthcare environment.


Friday 11 September 2015

Breakfast at Auchrannie’s (Or: How bad systems can make good people perform poorly) (by M Moneypenny)

Recently the family and I were lucky enough to be able to spend a few days at the Auchrannie Spa and Resort on the Isle of Arran. I would recommend both Arran and the resort to anyone. It has won a slew of awards and, according to TripAdvisor, is the #2 hotel in Brodick. However, goings-on during breakfast compelled me to write a blogpost…
A small selection of the awards

The problem with vegans

We are vegan, which I had informed the hotel of weeks before, during the booking process. I received a lovely email in response which stated: “I have emailed the restaurant manager with regards to your request for vegan sausages.” On arrival at the breakfast buffet we were greeted by a very pleasant maître d’ who made sure we hadn’t just wandered in off the street, found us a table, and told me to talk to the waiting staff about the dietary requirement.

So we availed ourselves of the continental breakfast and then had a chat with Sean, who was looking after the hot food part of the buffet. Sean told me that they did have vegetarian sausages but that he thought they weren’t vegan. He said that he seemed to remember asking the chefs a while ago and that they had told him this, but that he would enquire.

Sean then went through to the kitchen and had a chat with one of the chefs. After a little while he came back and told us that the sausages were in fact vegan. Great, we said, we’ll have three breakfasts please. Sean said: “Two?” And we said: “No, three, one for each of the adults and one to share between the kids.” It would take a wee while to make, he informed us, as they would cook everything fresh.

We sat back down and waited. And waited. And waited a little bit more. Then a friendly waiter called Will caught my eye and asked if we were okay. I told him we were waiting for our vegan breakfasts. Will said he would see what was happening. Unfortunately for him the swinging door into the corridor next to the kitchen has a clear glass window in it. This allowed me to see what happened next. Will walked through the door, looked into the kitchen, waited a little bit without speaking to anybody, then turned around and came back to tell us that they were almost ready.

Great. So we waited. And waited. And waited a little bit more. I took the kids over to the play area while my other half went to find Sean. Sean was very apologetic. He went into the kitchen to find out what was happening. He came back and informed us that the breakfasts hadn’t even been started yet. He had (he said) only verbally asked one of the chefs to make the breakfasts, and because he hadn’t written the order down nothing had been done. Sean apologised profusely and said he would be back with our breakfasts. About five minutes later there he was with two plates, which we gave to my better half (it was her birthday after all) and the kids. Sean wandered off. He didn’t come back. A few minutes later we managed to call him over and ask him about my breakfast; he said he thought we’d only wanted two and we said, no, three. Sean then came back a few minutes later with a single sausage on a plate…

The following day things went much more smoothly: there was no maître d’, but Sean welcomed us, sat us down and brought us three breakfasts.

Good people in a bad system

Other than being a somewhat boring story from my holiday (at least I’m not making you sit through holiday photos) what is the point of this blogpost? One major learning point for me is that even very caring people, who want to do the right thing, can be let down by the system. What improvements could be made?

  • There were more than enough waiting staff to allocate them specific tables. This would mean that “our” waiter/waitress would know we had been waiting longer than we should have been. The current system was chaotic, with tables cleared ad hoc: sometimes one waiter would get the cleaning spray out and leave it on the table to do something else, then another waiter would clean the table.
  • If you take on a “problem” (and I’m using that term to describe us) then you own it until you have passed it on to someone else. We were Sean’s problem and he should’ve kept an eye on us.
  • Empower your staff. I have no idea why Will didn’t actually speak to anybody in the kitchen, but he did recognise that something was amiss and he could have flagged up with the chefs that a table was awaiting a vegan breakfast.

The final take-home message is that the staff at Auchrannie are some of the most pleasant and courteous I have ever met. However, they were let down by the lack of coordination at breakfast. The same can be true of healthcare: excellent staff working in a faulty system can still result in disappointed patients. (Names have been changed to protect the innocent)

Friday 21 August 2015

"They did too well"

When observing a new faculty member it is not unusual to see a look of relief on his/her face when the participants in a scenario (finally) make a mistake. The faculty member may believe that if no mistakes are made then the facilitator will have nothing to talk about in the debrief. Below are a few tips on how to deal with the participants who "did too well".

Don't create a special crisis

Some may be tempted to throw a curveball into the scenario. "They're doing great, okay... Your patient has now arrested and he's also aspirated." Try to avoid this. Your scenario should run to your learning objectives. Creating a special crisis in order to have something to talk about in the debrief means the participants will be talking about the crisis and not your learning objectives.

It's not you, it's them

The introductory paragraph contains an obvious mistake: "the facilitator will have nothing to talk about in the debrief". The debrief is not an opportunity for the facilitator to talk. The debrief allows the facilitator to facilitate the discussion the group is having. This means that the faculty member should concentrate on how to make sure the learning objectives get discussed, not on whether the participants did or didn't do well.

Good scenarios are not designed to create mistakes

Good scenarios are designed to explore performance based on the learning objectives of your course; some participants will do well, others less well. All performance can be discussed. The words of Peter Dieckmann and Charlotte Ringsted are worth remembering:
"Learners' errors should not be seen as a personal victory in scenario design and implementation." (p.55, Essential Simulation in Clinical Education (Forrest, McKimm and Edgar (eds)))


Be enthusiastic and explore

Although "advocacy and inquiry debriefing" may have its faults (see blogpost here), its appeal to the facilitator to display genuine curiosity is a valid point. When the participants "did too well", why did that happen? What was their communication, leadership, teamwork, etc. like? How can we ensure that the next group of participants will do just as well?


Final thoughts

The desire to see participants make mistakes is a phase in the evolution of the facilitator. Most move beyond it, happy in the knowledge that good performance is a fertile ground for discussion as much as poor performance is.


Sunday 26 July 2015

When the equipment fails...

On 18th July 2015 Martin Bromiley tweeted:


The implication is that one should not pretend that something hasn't failed in a sim session. There are a few points for reflection here.

1) There are considerable differences between aviation and healthcare sim

Airline pilots have much more exposure to simulation than the average healthcare worker; this means that equipment failure in aviation sim can be addressed by rescheduling the session. This is not normally the case in healthcare, where a given participant may only be able to take part in a sim session every three years.
Aviation sims are much better funded than healthcare sims. In healthcare the use of out-of-date drugs and second-hand equipment is the norm. Equipment failure is therefore more likely in healthcare.
Healthcare sims also tend to involve the use of a plethora of equipment from different manufacturers and "cobbled-together" pieces of kit such as a simulated blood gas machine or a simulated X-ray machine. These are more likely to fail than bespoke flight simulators.
The bottom line? Aviation sims are less likely to fail and when they do, the ability to reschedule a sim session means that equipment failures can be "explored" to see how the participants cope with an unexpected problem.

2) There are different types of equipment failures

One of the most common types of failure in (mannequin-based) simulation is failure of the mannequin itself. Loss of power or of communications with the controlling device can mean the mannequin "dies". Other equipment failures may mean that, for example, one cannot feel a pulse on one arm, or that the pupils don't dilate, or that the simulated blood gas machine stops working. The faculty response to each of these equipment failures will be different, which brings us to point 3.

3) Response to equipment failure depends on the type of failure, faculty experience, the scenario and your learners

If we take sudden mannequin failure as an example, and three different scenarios:
  1. Patient in septic shock, hypoxic, hypotensive and moribund
  2. Patient with life-threatening asthma, silent chest and tiring
  3. Patient about to undergo elective surgery for laparoscopic cholecystectomy, chatting to anaesthetist
In the first two it would seem reasonable to continue the scenario with a pulseless, lifeless patient while you try to re-establish the connection to the mannequin (or plug him back in). In the third scenario, it would be best to interrupt the scenario, acknowledge a technical issue and fix it.

Of course, this still doesn't cover the "pretend it hasn't failed" situation. Imagine a home-made "X-ray machine" which displays an X-ray at the touch of a button. However, when the confederate radiographer goes to display the X-ray, nothing happens. One option would be to get the participants to decide what they would do in such a situation in real life, e.g. get another X-ray machine, continue on clinical judgment, auscultate the chest, ultrasound, etc. Another option would be to "pretend it hasn't failed", with the confederate providing them with a hard copy of the X-ray. This latter option may be particularly apt if the rest of the scenario depends on the ability of the participants to correctly interpret the X-ray.

Final thoughts

This post should not come across as carte blanche to make up for poor equipment maintenance or scenario planning. The "pretend it hasn't failed" response should be rare and limited to minor failures which will not throw the participants out of the simulated reality you have created for them (e.g. "I wasn't sure when he stopped breathing that he had really stopped or that you wanted us to pretend that he hadn't.") "Pretend it hasn't failed" is not the correct wording even when that is what you want the participants to do; well-trained faculty and confederates will be able to sculpt the scenario so that the equipment failure is quickly forgotten. "Pretend it hasn't failed" is also not the correct response when you are carrying out in situ systems-testing; the participants should deal with the failure as they would in real life. Lastly, if it's a course and the scenario learning objectives can still be achieved when the equipment has failed then, by all means, the participants should be allowed to develop their own solution to the problem. As faculty experience (and expertise) increases, one becomes better at predicting the likely consequences of a failure and the best response.

Wednesday 1 July 2015

Two is a crowd

A team may, very loosely, be defined as two or more people working towards a common goal. The benefits of working in a team are manifold: shared physical and mental workload, balancing of strengths and weaknesses, error trapping, etc. More accurately, the preceding sentence should be modified to say "The benefits of working in a good team are manifold." We have all had experience of dysfunctional teams which were much less than the sum of their parts, and would probably have functioned better if the individuals had worked independently. As Reason and Hobbs state: "Team management errors are one of the most serious threats to safety... problems include:"

  • team leaders being over-preoccupied with minor technical problems
  • failure to delegate tasks and responsibilities
  • failure to set priorities
  • inadequate monitoring and supervision
  • failures in communication
  • failure to detect and/or challenge non-compliance with SOPs
  • excessively authoritarian leadership styles
  • junior members of the crew or team being unwilling to correct the errors of their seniors

Although much depends on effective team members, the above list suggests that good leadership is paramount. Ironically, the education system, from primary school through to postgraduate education, praises and rewards individual excellence. This means that the A+ students who have become excellent personal achievers are then expected to work in, and lead, teams with very little prior preparation for this role. Although courses such as Advanced Life Support (ALS) expect candidates to show leadership skills, the team members are often faculty members and have to be spoon-fed instructions. Ostensibly this allows candidates to be assessed on their skills, without variable support from the team, but it creates unrealistic scenarios.

It is perhaps not surprising for the director of a simulation centre to suggest that simulation is part of the solution to team training. However, this is one of the greatest benefits of inter-professional simulation, whether in situ or in the simulation centre. Repeated practice with a focused debrief allows, some might say forces, teams to become more effective. There is still too much expectation within healthcare that competent individuals, when placed in the same room, will work well together. Unfortunately this is not the case. And practicing on an ad hoc basis with real patients is not only unethical but also ineffective; the lack of time for a debrief and the lack of uninvolved observers make learning from real patients difficult. So, could your team be practicing working together in a simulation centre (or in situ)? And if your team isn't doing this, then how do you justify poor performance in real cases?

Wednesday 10 June 2015

Book of the month: Essential Simulation in Clinical Education (Forrest, McKimm and Edgar (eds))

Disclaimer: The reviewer (M Moneypenny) co-authored a small section of a chapter in this book. He is also a friend of a number of the contributors (but has tried to write an objective piece!)

About the editors
Kirsty Forrest is Professor and Director of Medical Education at the Australian School of Advanced Medicine, Macquarie University, Sydney, Australia. Judy McKimm is Professor of Medical Education and Director of Strategic Educational Development at Swansea University, Swansea, UK. Simon Edgar is Director of Medical Education, NHS Lothian, Edinburgh, UK and Education Coordinator, SCSCHF, Larbert, UK. The three editors combine a significant amount of expertise in medical education, simulation and clinical practice.

About the contributors

There are 32 contributors (not including the editors): one from Ireland, one from New Zealand, two from Canada, three from the USA, four from Denmark and 21 from the UK. Although perhaps "UK-centric", the geographical spread results in a more diverse authorship than, for example, "Practical Health Care Simulations".

Who should read this book?

The back cover states: "A superb companion for those involved in multi-disciplinary healthcare teaching, or interested in health care education practices…" In reality the book should probably be on the reading list for anyone who is starting out in simulation-based (medical) education. More experienced educators may find specific chapters, aligned with their own interests, of relevance.

In summary

The book is divided into 14 chapters:

  1. Essential simulation in clinical education
  2. Medical simulation: the journey so far
  3. The evidence: what works, why and how?
  4. Pedagogy in simulation-based training in healthcare
  5. Assessment
  6. The roles of faculty and simulated patients in simulation
  7. Surgical technical skills
  8. The non-technical skills
  9. Teamwork
  10. Designing effective simulation activities
  11. Distributed simulation
  12. Providing effective simulation activities
  13. Simulation in practice
  14. The future for simulation

What's good about this book?


Every chapter starts with an overview and concludes with a short summary and an even shorter list of "key points", which is useful both as a reminder and as a reference for the reader in deciding whether the entire chapter is worth reading.

The editors make it clear in Chapter 1 that simulation is not a panacea. This view has been echoed elsewhere by Andrew Buttery (@andibuttri) on Twitter and by Trisha Greenhalgh (@trishgreenhalgh) in the British Journal of General Practice. Simulation is not magic and not all simulation is "good" simulation. "High fidelity" is also placed into context by Tom Gale and Martin Roberts who state: "…the blind use of the highest fidelity available is a principle which should be avoided" (p.63).

Chapter 3 "The evidence: what works, why and how?" by Doris Østergaard and Jacob Rosenberg is essential reading for all who are involved in designing simulation-based interventions and those undertaking research. They consider the features which make simulation effective, including feedback, deliberate practice and curriculum integration and they also look at some of the challenges faced by researchers in simulation.

What's bad about this book?

The order of the chapters could be reconsidered. The chapters on designing and providing effective simulation activities would logically appear nearer the beginning of the book, and the chapter on assessment nearer the end. In addition, there is a chapter on "Teamwork" but not one on "Leadership" (although one might argue that good leadership is part of good teamwork). Lastly, an entire chapter on distributed simulation (DS), although interesting, is probably not required in a book covering "essential" simulation. A section on DS could have been included in the "Simulation in practice" chapter.

Although overall a very good chapter, Chapter 5 "Assessment" by Thomas Gale and Martin Roberts refers to "assessment tools with appropriate reliability/validity…" (p.61). However, the tools themselves are not inherently reliable or valid; the scores produced by their use may be, and then initially only within the context in which the tool was created.

Final thoughts

This book covers the important topic areas well, including assessment, the roles of faculty and how to create effective simulations. This book therefore deserves a space on every simulation centre's bookshelf, as it provides a good overview of practical simulation in a digestible format.

Wednesday 27 May 2015

Affordances and constraints

In human factors, one of the areas of interest is human-object interaction. Some objects are extremely easy to interact with, often because they have been designed from the beginning with the human user in mind. Examples might include the iPhone (other smartphones are available) and Dyson vacuum cleaners (other vacuum cleaners are available). Other objects can be more difficult to work out. Anybody who has had to fold up a child's buggy or change the time on an oven clock knows what I'm talking about.

In human-object interaction, an affordance describes the actions that a human can readily perceive are possible (Figure 1). Well-designed objects make it readily apparent how they should be used. For example, looking at two LEGO® bricks it is pretty clear that they are made to stack on top of one another.
Figure 1: The handle on a coffee mug
suggests that it can be used to hold the mug. 

A constraint is a design feature which stops an undesirable action (Figure 2). The constraint may be physical or, for more complex objects, software-driven. Using LEGO® bricks again as an example, they tend to fit together in only very limited ways.

Figure 2: Some devices require post-manufacture
constraints to be added.

In healthcare, many devices are much more complicated than LEGO®. However, good design, using affordances and constraints, plays a strong part in minimising errors.

The TCI pump

The TCI pump is used to provide a target-controlled infusion of an anaesthetic (propofol) or a potent painkiller (remifentanil). The pump has a number of useful constraints to minimise errors. For example, if you inadvertently press the power off button, the pump displays "LOCKED" (Figure 3) and forces the user to carry out a sequence of steps to ensure that this was the intended action.
Figure 3: OFF button "slip" error prevented

This sequence includes two additional safety steps. An "OK" button press (Figure 4) needs to be followed by a separate "CONFIRM" button press (Figure 5), preventing an inadvertent double press. 
Figure 4: OK button
Figure 5: Confirm button
Lastly the "power off" button requires a continuous press (Figure 6).
Figure 6: Power off
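
For readers who think in code, this chain of constraints can be sketched as a simple state machine. The sketch below is hypothetical (the state names and the hold duration are my assumptions, not the Alaris firmware): a single inadvertent press cannot power the pump off; only the full OK, CONFIRM and sustained-press sequence can.

class PumpPowerOff:
    """Hypothetical sketch of the power-off constraint chain (not the Alaris firmware)."""

    HOLD_SECONDS = 2.0  # assumed duration for the continuous press

    def __init__(self):
        self.state = "RUNNING"

    def press_power(self):
        # A slip (inadvertent press) is trapped: the pump locks instead of powering off.
        if self.state == "RUNNING":
            self.state = "LOCKED"
        return self.state

    def press_ok(self):
        if self.state == "LOCKED":
            self.state = "AWAIT_CONFIRM"
        return self.state

    def press_confirm(self):
        # CONFIRM is a separate button, so a double press of OK cannot complete the sequence.
        if self.state == "AWAIT_CONFIRM":
            self.state = "AWAIT_HOLD"
        return self.state

    def hold_power(self, seconds):
        # Only a continuous press of sufficient length powers the pump off.
        if self.state == "AWAIT_HOLD" and seconds >= self.HOLD_SECONDS:
            self.state = "OFF"
        return self.state

pump = PumpPowerOff()
print(pump.press_power())      # "LOCKED": the slip is trapped
print(pump.press_ok())         # "AWAIT_CONFIRM"
print(pump.press_confirm())    # "AWAIT_HOLD"
print(pump.hold_power(2.5))    # "OFF": intention has been demonstrated three times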

Unfortunately the Alaris PK also has a few shortcomings. If one forgets to prime the pump when using remifentanil, it takes 7 minutes and 22 seconds before the pump alarms to tell you that a downstream clamp is still on. This means that there is a significant amount of time during which one may think that the pump is delivering a drug when it isn't. The video is time-lapsed to show that after 5 minutes the pump is reporting that both the plasma concentration and the effect site concentration have reached the set levels; however, not a single drop of remifentanil has been delivered. A simple design change would involve the pump not allowing an infusion to be started without first being primed.
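
The proposed design change could be as simple as a guard clause on the start action. Here is a minimal sketch, assuming a "primed" flag that the priming step sets (hypothetical, not the actual pump software):

class Infusion:
    """Hypothetical sketch of the proposed fix: starting is gated on priming."""

    def __init__(self):
        self.primed = False
        self.running = False

    def prime(self):
        self.primed = True

    def start(self):
        # The design change proposed above: refuse to start an unprimed line.
        if not self.primed:
            raise RuntimeError("Cannot start infusion: the line has not been primed")
        self.running = True

pump = Infusion()
try:
    pump.start()               # blocked: the priming step was skipped
except RuntimeError as error:
    print(error)
pump.prime()
pump.start()                   # now allowed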



The suction on the anaesthetic machine

The suction on an anaesthetic machine is used to remove body fluids such as airway secretions or gastric contents. It may have to be used in an emergency if the patient regurgitates gastric contents. On some anaesthetic machines the suction is placed next to the anaesthetic machine ON/OFF switch (Figure 7).

Figure 7: Tempting to switch the anaesthetic machine off by mistake
This means that a person may inadvertently switch off the anaesthetic machine when they meant to switch on the suction. Some anaesthetic machines allow you a grace period when you switch them off, in case you have made a mistake, so you can quickly switch them back on without them powering down. The anaesthetic machine pictured does not have this function. The manufacturer has instead installed a cover on the anaesthetic machine ON/OFF switch to act as a physical constraint (Figure 8).

Figure 8: The clear plastic cap acts as a physical constraint


The value of simulation

Many devices undergo only limited testing by carefully selected end users. Very few devices are tested under stressful or crisis conditions. This means that devices can be released without ensuring that they will be used as intended by the manufacturer. Simulation could be used to test products in realistic conditions during the design stages, without the risk of patient harm.
In addition, simulation could be used to train personnel in the correct use of the equipment, ensuring that the actions are maintained under crisis conditions.
Simulation for equipment design and training is greatly under-utilised. If manufacturers collaborated with simulation centres, their devices could integrate affordances and constraints which would minimise human-equipment interaction errors.



Friday 1 May 2015

Scottish Clinical Skills Network (SCSN) annual conference 2015: The great and the challenges (by M Moneypenny)

Easterbrook conference centre, Dumfries
Disclaimer: The author is vice-chair of the SCSN


The great

The SCSN conference provides a unique opportunity to network with like-minded people from across Scotland and further afield (there were delegates from England, Finland and the US). Although email has made the world smaller, as a speaker at a recent conference said: "Emailing is not 'having a conversation'". The SCSN conference allows everybody the opportunity to have a conversation, to explore areas of mutual interest, strengthen old ties and make new ones.

The three keynote speakers approached clinical skills from very different angles and all three were worth the trip alone. Professor Hugh Barr, President of the Centre for Advancement of Interprofessional Education, discussed interprofessional learning and teaching, the real benefits it provides and the challenges faced by those of us who deliver it. Professor Ken Walker, chair of the Scottish Surgical Simulation Collaborative, discussed the need for innovation in a training unit, but cautioned against "too much storming, and not enough norming and performing." Professor Jennifer Cleland, chair of council at the Association for the Study of Medical Education, talked about moving away from prior academic attainment for medical school admission as it is a poor predictor of post-graduate performance. She informed us that attempts to widen access to medical school have failed and she also discussed the differences between values (enduring beliefs) and personalities (enduring traits).

The social programme was the right mix between entertainment and networking, with whisky tasting, recitation of Burns' poems and a thought-provoking speech considering what "The Bard" would have thought of the plight of the people trying to reach European shores from North Africa.

The challenges


Strength through collaboration
The strength of the network lies in its bottom-up, grassroots nature, attracting members who are interested in clinical skills from across Scotland. The scope for collaboration is enormous. However, the majority of the presentations and posters showcased research from a single institution. When research was collaborative, the most common partnership was between institutions in the same city (e.g. University of Aberdeen and Robert Gordon University, University of Glasgow and NHS GG&C). Notable exceptions were collaborations between the University of Aberdeen and the University of Ottawa, and a multi-agency exercise between the Scottish Fire and Rescue Service, the Scottish Ambulance Service, the Emergency Medical Retrieval Service and Yorkhill Children's Hospital. With a little planning it should be possible for much of the research to be carried out in multiple centres. This would take a bit more work, but it would also make the results more robust, reduce the risk of repeating a similar (under-powered) study and improve the chances of asking the right questions in the first place. To encourage collaboration, future abstract submissions could have a weighting for multi-centre studies.

Minor IT issues meant that some speakers' slides were not displayed correctly, and a laptop failure meant that some speakers were unable to display their slides at all. A policy of requesting that all slides be uploaded on the first day, plus a back-up laptop, should minimise these problems in the future.

Final thoughts

One of the most well-attended Scottish health conferences in recent years, the get-together in Dumfries shows the continued relevance of the SCSN to the development and promotion of clinical skills training. The next conference is in Aberdeen on the 20th-21st April 2016. See you there?

Monday 27 April 2015

Book of the month: Medical error and patient safety: human factors in medicine (Peters & Peters)

About the authors

George and Barbara Peters are a father and daughter team. According to the included biography, George Peters is a multidisciplinary specialist, with experience as a safety specialist and as a design, reliability and quality engineer. Barbara Peters "has specialized in problem solving relating to medical error, safety, risk assessment, and environmental health hazards".

Who should read this book?

The authors state that this book is a basic textbook and reference manual "for those who may attempt to deal with and minimise medical error" (p.6). They go on to say that: "Most readers will have little difficulty understanding the discrete word phrases, simplified specialty language, unique concepts, and general suggestions for the improvement of patient safety by reducing medical error." (p.8) Unfortunately, as explained below, probably very few people should read this book.

In summary

The book is divided into 9 chapters:

  1. Introduction
  2. General Concepts
  3. Medical Services
  4. Medical Devices
  5. Analysis
  6. Human Factors
  7. Management Errors
  8. Communications
  9. Drug Delivery

What's good about this book?

The use of simulators for learning, practice, and refresher training might help in… emergency, crisis or rare event scenarios (p.20)
The authors mirror Ronnie Glavin's question as to why there has been so little change since the 1999 report "To Err is Human", saying: "There was no magic remedy, only a seemingly complex and intractable human behaviour problem" (p.2). The authors call for an increase in transparency with respect to medical error and for the harmonisation of standards (e.g. US, EU, international).

The use of simulation and simulators is considered and recommended (e.g. see photo caption), including the use of simulation for resilience or stress-testing "intended to discover and correct weaknesses in the system so that a hardened or more robust organisation will result".

There is the occasional interesting concept, e.g. an "equal status" program instituted at some hospitals in which "all personnel are considered equal and a vital part of the team" (p.23). They also suggest the need for "error detectives" who are authorised to cross organisational boundaries and hierarchies and provide a direct feedback loop to management. The authors also argue for the need for civility and that it should "prevail under normal, stressful, and even extraordinary circumstances." They recommend being honest and advocate the 3R approach (Regret, Reason, Reparation) to apologies. They also call for HF studies to start at the product design stage. Chapter 9, Drug Delivery, covers a number of useful concepts such as the use of warning symbols, labelling, and prescription directions (such as the slightly unnecessary "Caution: This medicine may be taken with or without food" sticker on a Lisinopril container).

The caveats at the end of each chapter, rather than being caveats, actually provide an overview of much of the material covered.

What's bad about this book?

The almost total lack of a narrative or co-ordinated, logical approach to any of the chapters gives this book an "Alice in Wonderland" feel without the great prose. The examples are numerous: 
  • The first sub-heading in "Intentional Bias" (p.15) is "Knee", the next is "Head"
  • In "Chapter 2: General Concepts", "Teamwork" (p.20) is stuck between "Harmonization" and "Rationalisation". 
  • In "Chapter 3: Medical Services", "Innocent Errors" (p.24) is found between "Civility" and "Patient Involvement" (and, no, the headings are not arranged alphabetically)
  • In "Chapter 7: Management Errors", which covers errors due to poor supervision or management decisions, one of the subheadings is "Home Use Devices". This section makes no reference to management.
There is no overview at the beginning of each chapter and, although the authors claim to be providing a simplified language, they write sentences such as:
"(Factor analysis) is primarily used to test the ranks of number matrices if statistical correlation coefficients are available and express a relationship between the variables" and
"However, there have been dramatic changes or improvements in medical knowledge and procedural skills, medical equipment and devices, pharmaceutical efficacy and safety, higher social expectations from the medical profession, informed consent, animal rights, the intrusion of regulatory and legal concepts of social responsibility, complex payment schemes and practices, and rapid growth of medical organisations that stress highly interactive group behaviour, financial outcomes, and business goals" (p.108)
The authors also make sweeping generalisations such as:
"Research has illustrated that there is considerable effort in some areas to locate surface antimicrobial agents" (p.42)
"All (hospital) hardware should be compatible with the limited capabilities of the aged, infirm, sick, and disabled" (p.53)
"Criticism is to be expected during periods of rapid transition. The ideal will become the reality." (p.54)
"Special training may be necessary to prevent situation awareness errors." (p.103)
"A radiating positive attitude should be displayed by managers." (p.140)

Puzzling sentences abound, such as: "Patients may have special childhood problems such as foreign objects stuck in the throat, respiratory tract, or other body openings. These objects may also include food, bones, nuts, or vegetables not properly chewed".

And the occasional unnecessary fact is thrown into the mix: "When competing goals, such as fairness versus self-interest, are present, the brain areas involved and activated are the anterior insula and the right dorsolateral prefrontal cortex." (p.149) and "What may be surprising is that long-term potentiation is first induced in the hippocampal area... fast neuron-glia synaptic transmission has been found between CA1 hippocampal neurons and NG2 macroglial synapses" (p.167-168).

Final thoughts

In the Preface, the authors state that they tried to eliminate bias in the book: "There was no third-party direction or control." (p. xvi) Unfortunately this is exactly what the book is lacking: some sort of coordinating entity which might have turned this, at times rambling, book into a worthwhile read. The book's title, "Medical Error and Patient Safety: Human Factors in Medicine", seems to have been applied in retrospect, and none of the subjects is well covered. In its present form this book should be given a wide berth.

Friday 27 March 2015

Flight 4U 9525 and the dynamic Swiss cheese model

Swiss cheese model of accident causation
The Swiss cheese model
James Reason's Swiss cheese model explains how successive layers of defences can be breached, or weaknesses can line up, in order for an incident to occur.

The traditional depiction of the model is of a static succession of slices of cheese. In this model, closing one of the holes in the sequence prevents the incident.

A better way of visualising the concept is by thinking of a dynamic Swiss cheese model (see video). The weaknesses are not static and closing one weakness may cause another to open elsewhere in the same (or another) "slice".
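
The difference between the static and dynamic pictures can be made concrete with a toy simulation. The sketch below is purely illustrative (the layer count, hole count and drift rule are my assumptions, not taken from Reason's work): each layer's holes drift over time, and an incident trajectory exists whenever some position is open in every layer at once.

import random

# A toy Monte Carlo sketch of a *dynamic* Swiss cheese model. Illustrative
# only: the parameters and mechanics below are assumptions for the sake of
# the demonstration.

N_LAYERS = 4          # defensive layers ("slices" of cheese)
N_POSITIONS = 20      # discrete positions a hole can occupy on a slice
HOLES_PER_LAYER = 3   # weaknesses per layer
N_STEPS = 10_000      # time steps to simulate

def drift(layers):
    """Move each hole one position left or right: the weaknesses are not static."""
    return [[(hole + random.choice((-1, 1))) % N_POSITIONS for hole in layer]
            for layer in layers]

random.seed(42)  # reproducible run
layers = [random.sample(range(N_POSITIONS), HOLES_PER_LAYER)
          for _ in range(N_LAYERS)]

aligned_steps = 0
for _ in range(N_STEPS):
    layers = drift(layers)
    # An incident trajectory exists when some position is open in every layer.
    if set.intersection(*(set(layer) for layer in layers)):
        aligned_steps += 1

print(f"Holes lined up on {aligned_steps / N_STEPS:.1%} of time steps")

Re-running the sketch with more layers or fewer holes per layer shows how extra defences reduce, but never eliminate, the steps on which the holes line up.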


Flight 4U 9525

The exact circumstances of the crash of the Germanwings Airbus 320 on 24 March 2015 have yet to be established. However, it seems likely that the co-pilot intentionally flew the plane into the ground. The captain had probably left the flight deck to use the toilet and was then locked out of the cockpit by his first officer.

Post-9/11 cockpit doors

After the 9/11 terrorist attacks, cockpit doors were reinforced in order to prevent forced access; in terms of the Swiss cheese model, this weakness was reduced. Crew could still access the cockpit by entering a keypad code if the pilot had become incapacitated. However, a pilot who was not incapacitated could override the keypad system, so a terrorist in possession of the code could still be prevented from getting into the cockpit.

The dynamic Swiss cheese model

Closing the weakness in the structural/system layers that allowed terrorists access to the cockpit opened a weakness covering the circumstances in which someone may want to access the cockpit for legitimate reasons against the flight crew's wishes. After the loss of Malaysia Airlines flight MH370, Popular Mechanics wrote a prescient article in March 2014 asking "Could Plane Cockpits Be Too Secure? Should pilots be allowed to lock themselves in the cockpit?" After the crash of Flight 4U 9525, in an attempt to close this new weakness, many airlines now require the presence of two crew members on the flight deck at all times. It is unclear what new weaknesses this policy will create.

Lessons for the rest of us

Measures put in place in response to an incident will almost inevitably increase the risk of other, unforeseen incidents occurring. Time spent carrying out analyses and simulations of possible side-effects of the "fix" may allow us to minimise these new weaknesses.

Wednesday 4 March 2015

Book of the Month: Being Mortal (Illness, Medicine and What Matters in the End) by Atul Gawande (Reviewed by Kirsten Walthall @K_Walthall)

About the author


Atul Gawande is a general and endocrine surgeon based at the Brigham and Women’s Hospital in Boston, USA. Gawande is also a staff writer for The New Yorker magazine and the author of four best-selling books. His latest work, published in late 2014, is titled “Being Mortal: Illness, Medicine and What Matters in the End”.

Who should read this book?


Everyone – healthcare providers, patients and the lay public. Although several issues are highlighted through a case series of patients with medical problems, the book does not focus on the ins and outs of medical matters, such as the specifics of treatments. Gawande’s easy style of writing makes this book accessible to all. 

In summary


This book explores the concept of mortality and the impact that modern day medicine has had on it.
Using a series of cases, Gawande discusses the experiences of several people as they grow old, some with life-limiting diseases and others who simply become frail. He looks at the struggle to retain independence and autonomy, and at how care systems often try to provide support in a regimented way. Gawande discusses the concept of “assisted living”, which helps people to continue to live the lives that they have lived. Furthermore, Gawande explores the belief of healthcare professionals that they have failed when a patient dies. Many find it difficult to accept that medicine cannot fix everything, and therefore may give their patients poor information about what medical management can realistically be expected to accomplish. Gawande discusses the importance of having those hard conversations with patients to find out what matters most to each individual, so that therapy and care can be tailored to them. He argues that what we should be striving for is maintaining quality of life until death, rather than just prolonging life itself.

What’s good about this book?

Gawande uses #whatmattersmost on Twitter

The use of case studies and personal experiences to explore the issues involved in growing old and dying engages the reader. Gawande’s writing style makes “Being Mortal” very easy to read despite the potentially heavy subject matter. Mortality was not well covered in my undergraduate training – indeed it was barely touched upon – and I suspect that this is the same across the board in undergraduate medical education. This impression is supported by a study by Bowden et al. (2013), who found that Foundation Year doctors expressed a lack of readiness to deliver end-of-life support and care. “Being Mortal” really makes the reader think about the latter stages of life and the importance of preserving what matters most to each individual. It gives the reader an understanding of mortality that, for the healthcare professional, will benefit her patients and, for the individual, will benefit her, her relatives and her friends.

 
What’s bad about this book?


This is not a quick read. It is very thought-provoking and encourages discussion: you will need time to read, absorb and think about its contents.

Final thoughts


This is by far the most inspirational and thought-provoking book I have read – a must-read for anyone involved in patient care.

Reference


Bowden, J., Dempsey, K., Boyd, K., Fallon, M. and Murray, S.A. (2013) 'Are newly qualified doctors prepared to provide supportive and end-of-life care? A survey of Foundation Year 1 doctors and consultants', Journal of the Royal College of Physicians of Edinburgh, 43, pp. 24-28. [Online] Available at: http://www.rcpe.ac.uk/sites/default/files/bowden.pdf (Accessed: 2 March 2015)