In their book "Managing Maintenance Error: A Practical Guide", Reason and Hobbs inform us that most errors are predictable:
"...more than half of the human factors incidents in maintenance are recognised as having occurred before, often many times" (p.98)
In healthcare, a similar pattern emerges. For example, a nasogastric tube is wrongly placed into a patient's lungs and the liquid feed is started. In England and Wales, between 2005 and 2011, twenty-one people died as a result of this error. The commonest drug error in obstetric anaesthesia is mistaking thiopentone for an antibiotic, and vice versa. In the UK, there was at least one such incident in 2010, two in 2011, and one in 2012.
These are examples of error traps. In the seascape of human performance, error traps act as whirlpools, seizing the inexperienced, the tired and the distracted. James Reason tells us that the defence against error traps is organisational. In anaesthesia, the Safe Anaesthesia Liaison Group (SALG) publishes patient safety updates which detail adverse incidents and provide suggestions for avoidance and mitigation. The National Patient Safety Agency (NPSA) performed a similar role for the rest of the healthcare system, but it was disbanded in June 2012, its activity subsumed within NHS England. Many individual departments have morbidity and mortality (M&M) meetings where adverse events are discussed and defences created or adjusted. In addition, individuals will create their own personal defences, such as always having the antibiotic in a 30-ml syringe, triply-labelled.
There are problems with all of these solutions. The patient safety updates are not mandatory reading, and there is no assessment of the individual, the department or the hospital to ensure that lessons have been learnt. Attendance at M&M meetings can be variable, and sharing of the discussions and conclusions may be sporadic. Individual defences may be breached through performance degradation, or by another healthcare worker who is not aware of them.
[Image: Sisyphus, by Titian]
Sisyphus, a deceitful king in ancient Greek myth, attracted the wrath of Zeus. His punishment: for all eternity he would be forced to roll a boulder up a steep hill, only for it to roll back to the bottom. In a similar fashion, we are doomed to repeat, if not our own mistakes (because we create personal defences), then the mistakes of others. It is extremely likely that the error we were involved in today was made by someone else, somewhere else, last week or last month.
A number of solutions are called for. Nationally, we need a system for reporting errors, such as the National Reporting and Learning System (NRLS); these error reports need to be analysed by clinicians and human factors experts to reveal error traps. Also nationally, there should be a mandatory requirement for healthcare personnel to inform themselves of adverse events occurring in their field. On a local level, we need a robust reporting system which feeds into the national system, as well as a safety culture, including M&M meetings, which encourages and rewards the reporting of adverse events and near misses. Also on a local level, there should be defined responsibility and accountability for maintaining and modifying system defences.
The role of simulation
Simulation has a number of roles to play. First, systems testing using simulated events, such as a major haemorrhage or a fire, can, if performed correctly, reveal weaknesses in the defences. Second, the errors that participants make in the simulation centre are likely to occur in the workplace. For example, in the past few years we have had two incidents where, in a crisis, a participant has switched off the anaesthetic machine when they meant to switch on the suction. As can be seen from the image, this is an understandable error. The same error has occurred in the actual operating theatre. (It is likely that the same error has occurred a number of times across the UK, as the manufacturer has now designed a clear plastic lid for the anaesthetic machine switch, thus creating a physical barrier.) Do we, as simulation centres, have a duty to flag up common errors to the safety agencies? The third role for simulation is to raise awareness of performance limitations and error traps. Although relying on the person "at the sharp end" to defend against all errors is wrong, it is often that person who acts as the final defence when the system breaks down. Making everybody involved more aware of error traps can therefore only be a good thing.