Beyond blame

Professor Peter McCulloch, co-director of the Patient Safety Academy, describes the benefits of a human factors approach to investigating medical error

Date: 21 July 2020

A trainee in ITU applies suction to a postoperative drain despite clear instructions from the surgeon not to, resulting in serious morbidity.

A neurosurgeon inadvertently opens the wrong side of the head.

An FY1 doctor neglects to restart anticoagulants for a patient after chest drain insertion, resulting in a pulmonary embolus.

A core trainee consents the wrong patient for an emergency operation, and the mistake is only discovered once the patient is anaesthetised.

An experienced paediatric registrar fails to recognise signs of serious sepsis in a child with special needs, who subsequently dies.

These are examples of the serious clinical incidents (suitably modified to ensure anonymity) that we have investigated in the last few years at the Patient Safety Academy (PSA)*, coming from a range of hospitals throughout England.

In each case the doctor most directly linked to the incident was under intense scrutiny, and the stakes were high. At the milder end of the spectrum, the doctor’s reputation and career could have suffered serious damage. At the other end, the doctor faced erasure from the General Medical Council (GMC) Register and even prison. To underline this, the last vignette is, of course, not one of our cases but that of Dr Hadiza Bawa-Garba, who was convicted of gross negligence manslaughter and whose case became a cause célèbre.

The cases all have a number of things in common. Perhaps the most important is that, when analysed using a human factors approach, the prima facie impression of an inexcusable lapse from professional standards is considerably modified. In the third case, for example, we found that the patient was inappropriately referred and consequently managed very indecisively, leaving the FY1 in a confusing guessing game about the right anticoagulant treatment. In none of the PSA cases was any action ultimately taken against the doctor involved because the human factors analysis showed that this would be completely unreasonable.

This doesn’t mean human factors are a kind of “get out of jail free” card for low clinical standards. Some analyses still conclude that the actions of the doctor at the “sharp end” were unacceptable, even once all the circumstances are considered. But many more do not, and the frequency with which this occurs illustrates the excessive fixation of our clinical culture on individual accountability – or to put it more bluntly, on blame.

Human factors redress the balance by focusing on how the system failed rather than who was responsible. The important advantage of this approach to analysing error is that it is much more effective at identifying ways to prevent it happening again. Punishing or expelling an individual is a very ineffective method of doing this, even if done in a way calculated to strike maximum fear into others. Our investigations repeatedly demonstrate that disasters are multifactorial events, and the human being unlucky enough to be most closely involved is rarely the deciding factor.

Forensic sweep

Human factors or ergonomics grew out of Taylorism¹, an approach to manufacturing which emphasised analysing and improving system efficiency by minutely studying the details of its function. It has evolved into a discipline which systematically analyses systems of work involving humans, looking especially at the interactions between workers, their equipment and the environment they work in.

Healthcare has been slow in adopting a human factors approach to analyses of error. Civil aviation led the way from the 1960s onwards, but nuclear power, oil and gas extraction, rail and maritime transport, construction and the military are amongst other sectors where it is now routine. Typically, a human-factors-based investigation is led by specialists with a thorough professional grounding in the techniques and principles, aided by content experts who can explain the technicalities of the system being studied.

Crucially, teams are external to the organisation being studied and have no conflict of interest – a stark contrast to current NHS practice. A thorough forensic sweep is made for relevant evidence. This includes not only written and electronic records but also interviews with staff and managers involved. These are constructed so as to throw light on the entire range of potentially relevant factors.

Simple models are used to ensure that all the dimensions of an organisation are considered. A commonly used model is SEIPS², which classifies influences under people, organisation, environment, task and tools, whilst another, the 3D model³, simply focuses on system, culture and technology. Once the basic facts, the timeline and the key influences are established, additional techniques can be used to identify specific weak points which create “accidents waiting to happen” and focus attention on these. Recently, the human factors discipline has started to include analysis of system strengths as well as weaknesses, in a movement labelled “Safety-II” or Resilience Engineering⁴.

Implementation challenge

The potential value of a human factors approach to improving the safety of healthcare is pretty obvious. Implementing it is, however, a huge challenge. Our culture and way of thinking, the structures of healthcare organisations and the approach of the powers which govern us (including the GMC) have for generations been based on a completely different model.

The idea of professionalism in the NHS is steeped in Victorian ideas of virtue, which promote a heroic (and unattainable) model of the vocationally called doctor/nurse/midwife/other selflessly dedicating themselves to the good of their patients and overcoming all obstacles through perseverance and moral rectitude. Belief in this ideal has probably been one of the biggest factors in allowing the NHS to survive for so long, as it has stimulated countless thousands of us to try harder for longer, and to aspire to high standards of performance without seeking rewards. But its major downside is a culture of blame for those who fail, which leads inevitably to fear, guilt, hypocrisy and – most importantly – reluctance to discuss error openly and rationally.

The GMC has a duty to protect the public from doctors who should not be practising. This necessary task will not be eliminated by changing the way in which context is evaluated, but it might be made fairer. At present the GMC are placed in the unenviable position of judging doctors using an outdated legislative framework which limits their room for manoeuvre. The result has been a serious loss of confidence by the profession in their regulator.

To their credit, the GMC recognised the scale of the problem before the Bawa-Garba case became a media sensation, and were already seeking help in initiating change⁵. We have since been providing human factors training to their Fitness to Practise division and, equally important, working with them to see how their process for conducting investigations can be modified to ensure human factors are always taken into account. To diminish the reluctance of doctors to be open about error will, however, require the GMC not only to change but to be seen to change, and this will require careful messaging over a period of years.

COVID-19 in the mix

Currently, of course, COVID-19 is massively affecting patient safety and its investigation, as it has every facet of healthcare. The basics of the human factors approach to analysis are not affected – but the relevant influences on human behaviour and decision-making have changed hugely, and in many cases the factors making error more likely have increased. The greatest danger of the COVID-19 phenomenon is that investigation is simply jettisoned so that all resources can be diverted to treatment.

As the current surge in the pandemic wanes and we look towards an uncertain future, a rapid and dedicated human-factors-based analysis of what we got right and wrong in patient care would be a wise investment so as to prepare our system for the expected second wave.

Peter McCulloch is a professor of surgery at the University of Oxford and co-director of the Patient Safety Academy

References

  1. Nelson D, Campbell S. Taylorism Versus Welfare Work in American Industry: H. L. Gantt and the Bancrofts. Business History Review 1972; 46(1): 1-16. doi: 10.2307/3112773
  2. Carayon P, Schoofs Hundt A, Karsh B-T, Gurses AP, Alvarado CJ, Smith M, Flatley Brennan P. Work system design for patient safety: the SEIPS model. Qual Saf Health Care 2006; 15(Suppl 1): i50-i58. doi: 10.1136/qshc.2005.015842
  3. McCulloch P, Catchpole K. A three-dimensional model of error and safety in surgical health care microsystems. Rationale, development and initial testing. BMC Surg 2011; 11: 23. doi: 10.1186/1471-2482-11-23
  4. Hollnagel E, Woods DD, Leveson N (eds). Resilience Engineering: Concepts and Precepts. Ashgate Publishing; 2006
  5. Morgan L, Benson D, McCulloch P. Will human factors restore faith in the GMC? BMJ 2019; 364: l1037. doi: 10.1136/bmj.l1037

* The Patient Safety Academy was formed from a collaboration between two research groups in the University of Oxford with expertise in patient safety, human factors/ergonomics and improvement science: the Quality, Reliability, Safety and Teamwork Unit (QRSTU) and Oxford Simulation Teaching and Research (OxSTaR).

