RISK: Thinking errors

Risk adviser Alan Frame considers errors in diagnosis and why doctors make them. Feature article from Winter 2015 issue of MDDUS Summons

WE all know that mistakes can happen in medical practice and thanks to our ongoing ‘cause-of-loss’ analysis here at MDDUS we now have a greater insight into the nature of these errors. Our research recently revealed that around 60 per cent of negligence claims against our GP members are related to alleged failure to diagnose.

So the next logical question is why do these errors occur? While some incidents can be blamed on system errors, such as the misfiling of a test result, many can be attributed to what are known as ‘thinking errors’.

Psychologists have identified several reasons why we humans make mistakes in our thought processes, despite possessing the knowledge and ability to think correctly. In 2003, the journal Academic Medicine listed examples of thinking errors, including:

• Anchoring bias – locking on to a diagnosis too soon and failing to adjust to new information.

• Availability bias – thinking that a similar recent presentation you have encountered is also happening in the present situation.

• Confirmation bias – looking for evidence to support a pre-conceived opinion or notion, rather than looking for information to prove oneself wrong.

• Diagnosis momentum – accepting a previous diagnosis without applying sufficient scepticism.

• Overconfidence bias – over-reliance on one’s own ability, intuition and judgement.

• Premature closure – similar to confirmation bias, but more like jumping to a conclusion.

• Search-satisfying bias – a “eureka” moment that stops all further thought on the matter.

The most common thinking error in the practice of medicine is said to be the anchoring bias, where doctors may jump to premature conclusions by assuming that they’re thinking about things in the right context when they may not be. This may lead to a failure to undertake a broader search for other possibilities.

Zebras or horses?

A recently closed MDDUS claim illustrates some of the above points in action. The case involved the failure to diagnose a 22-week gestation pregnancy in a 16-year-old presenting with symptoms of vomiting and amenorrhea. The girl denied having had sex and a pregnancy test was not carried out. She went on to deliver a healthy baby but developed post-natal depression. She raised a claim against her GP arguing that, had she known of the pregnancy, she would have requested a termination. MDDUS sought an expert opinion on liability which suggested a “weak defence” and a settlement was agreed before trial.

An important feature in this case was that the patient had presented on several occasions to out-of-hours services and other GPs complaining of the same symptoms. An existing three-year history of sinusitis was assumed early in each encounter to be the cause of her nausea and vomiting. A period of three months then passed, during which several consultations took place, before a home pregnancy test proved to be positive.

Our expert made a number of observations about her care. He noted an over-reliance on previous diagnoses and explanations offered by other doctors (confirmation bias, anchoring bias, diagnosis momentum) and an over-reliance on the patient’s denial of sexual activity, made in the presence of her mother, who was evidently “very involved” in her daughter’s care (overconfidence bias, premature closure).

The expert commented that it seemed strange that, when confronted on several occasions with a 16-year-old who had missed periods, the GP did not question whether the girl might be pregnant.

Our member conceded that he had not considered the possibility of pregnancy but he pointed out that other doctors had not either. There had been no mention of missed periods at this time and he was also not aware that the girl was sexually active.

So how can we guard against the thinking errors illustrated in this case? Some experts support greater use of information technology to help us overcome our natural biases and hopefully avoid diagnostic errors. But health IT has its own biases as well. Remember GIGO – garbage in, garbage out!

Diagnostic tools such as template charting provide an illustrative example. A patient tells a clinician: “I’ve been vomiting and my chest hurts.” If the clinician plumps for a template for vomiting, gastroenteritis or abdominal pain too quickly, they could easily lead themselves up the garden path, overlooking the fact that what the patient really meant to say was: “I started having this really heavy chest pain and have been vomiting ever since.” Using the template first selected could lead to the patient being discharged with an undiagnosed MI.

Jumping to conclusions

Consider another case. An overweight lady on the contraceptive pill presents to her doctor complaining of pain in her left calf. GP A is unsure of the diagnosis but appears to consider and eliminate the possibility of a deep vein thrombosis (DVT). The patient then consults GP B at the same practice seven days later and now also complains of chest pain on deep inspiration. GP B treats the patient with a non-steroidal drug and sends her on her way.

Later that same evening an out-of-hours doctor suspects a pulmonary embolism and admits her to hospital. The diagnosis is confirmed and fortunately the lady is treated successfully and makes a full recovery. An allegation of medical negligence is subsequently made against both GPs, claiming that they failed to note the gravity of the situation and that such a failure was not reasonable.

An MDDUS expert witness is asked to review the care provided. The patient was overweight and on oral contraceptives, both known risk factors for DVT. Our expert notes that GP A, while having no specific recall of the consultation, was sure that she would have considered a DVT as a possible diagnosis as she had taken and recorded a measurement of the calf circumference in each leg. She was apparently reassured that the circumferences were equal and that there was no heat, redness or oedema present. A rash was noted on one of her feet, but it was not recorded which one.

Importantly, in her record of the consultation, GP A did not say what she considered to be the diagnosis or even if she had a working hypothesis. Our expert also observes that with “even the most careful clinical examination of a patient such as this, it is very difficult to exclude a DVT by examination alone, and that most ordinary competent GPs should know this”. Our expert considers that GP A would have been wise to arrange for the patient to be assessed that day at the local hospital. He also expresses “some surprise” that GP B, seven days after the initial presentation, did not immediately fear that the lady had suffered a DVT and then a pulmonary embolism. He remarks: “What else would give rise to a sore calf and later pain in the chest on inspiration?”

Our expert concludes that it would be difficult to defend GP A’s actions in failing to arrange for an urgent hospital admission. He also finds it difficult to understand why GP B did not make the diagnosis seven days later.

So what thinking errors were at play here? GP A apparently considered a diagnosis of DVT at the initial consultation, as evidenced by her taking calf measurements. But for some reason this diagnosis was discounted. What reassured her? Was it the uniformity of the measurements, or the lack of any apparent heat, redness or oedema?

Our expert witness identified significant risk factors that should have set alarm bells ringing. Was there an issue with an inadequate history which would have flagged up the likelihood of a DVT before any physical examination actually took place? We will never know in this particular case, largely due to GP A’s poor memory recall and an absence of any reference to history taking in the consultation notes.

We can only speculate why this may have happened. Confirmation bias is a common thinking error where a doctor may look for evidence to support a preconceived opinion or notion. Closely related to this is the concept of premature closure, which involves more of a “jumping to a conclusion”, i.e. did the equal calf circumference measurements rule out the option of a DVT in the doctor’s mind?

Doctors typically generate several possible diagnoses early in their encounter with a clinical problem. Premature closure can occur when a conclusion is reached before it has been fully verified. The tendency to apply closure to the problem-solving process can result from vivid presenting features that may be convincing for a particular diagnosis or by anchoring on to salient features early in the presentation.

Such thinking problems can be at least partially avoided by simply being aware that they exist and, of course, there is no substitute for experience.

Alan Frame is a risk adviser at MDDUS