Errors in diagnosis and why doctors make them

We all know that mistakes can happen in medical practice, and thanks to our ‘cause of loss’ analysis we now have greater insight into the nature of these errors. Our research recently revealed that a whopping 60 per cent of MDDUS negligence claims against GPs are related to a failure to diagnose. (See my previous blog.)

So the next logical question might be why do these errors occur? While some incidents can be blamed on system errors, such as the misfiling of a test result, many can be attributed to what are known as ‘thinking errors’.

Psychologists have identified several reasons why we humans make mistakes in our thought processes despite possessing the knowledge and ability to think correctly. In 2003 the journal Academic Medicine listed examples of thinking errors, including:

  • Anchoring bias – locking on to a diagnosis too soon and failing to adjust to new information.
  • Availability bias – assuming that a similar recent presentation you have encountered is also what is happening in the present situation.
  • Confirmation bias – looking for evidence to support a preconceived opinion or notion, rather than looking for information to prove oneself wrong.
  • Diagnosis momentum – accepting a previous diagnosis without applying sufficient scepticism.
  • Overconfidence bias – over-reliance on one’s own ability, intuition and judgement.
  • Premature closure – similar to confirmation bias, but closer to jumping to a conclusion: calling off the diagnostic search before the diagnosis has been fully verified.
  • Search-satisfying bias – a “eureka” moment that stops all further thought on the matter.

The most common thinking error in the practice of medicine is said to be ‘anchoring bias’, where doctors may jump to premature conclusions by assuming that they’re thinking about things in the right context when they may not be. They may then fail to undertake a broader search for other possibilities.

Zebras or horses?

A recently closed MDDUS claim illustrates some of the above points in action.

The case involves the failure to diagnose a 22-week gestation pregnancy in a 16-year-old presenting with symptoms of vomiting and amenorrhoea. She denies having had sex and a pregnancy test is not carried out. The girl goes on to deliver a healthy baby but develops postnatal depression. She raises a claim against her GP arguing that, had she known she was pregnant, she would have requested a termination.

MDDUS seeks an expert opinion on liability which suggests a “weak defence”, and a settlement is agreed before trial.

An important feature of this case was that the patient had presented on several occasions to out-of-hours services and other GPs complaining of the same symptoms. Early in the encounter, an existing three-year history of sinusitis was assumed to be the cause of her present difficulties and was diagnosed as the explanation for her nausea and vomiting. A period of three months then passed, during which several consultations took place, before a home pregnancy test proved to be positive.

Our expert made a number of observations about her care, including:

  • There had been an over-reliance on previous diagnoses and explanations offered by other doctors. (Confirmation bias, anchoring bias, diagnosis momentum)
  • There had been an over-reliance on the patient’s denial of sexual activity, made in the presence of her mother, who was evidently ‘very involved’ in her daughter’s care. (Overconfidence bias, premature closure)

Our expert noted:

“It seems exceedingly strange when confronted with a 16-year-old on several occasions having missed periods – not to enquire concerning the possibility that she is pregnant.”

Our member conceded the following in relation to their initial consultation:

“I did not consider the possibility of pregnancy but note that other doctors had not either, and there had been no mention of missed periods at this time. I was not aware that she was sexually active”.

Within this thinking we can see characteristics of several identifiable thinking errors.

Our expert concluded that a patient who is seen by one or even two doctors potentially gains an advantage from better continuity of care. But a patient seen by a number of doctors about the same problem runs the risk that no particular practitioner will ultimately take responsibility for a full review of presenting symptoms.

So, can we guard against this occurring? Some experts support greater use of information technology to help us overcome our natural biases and hopefully avoid diagnostic errors. But health IT has its own biases as well. Remember GIGO – garbage in, garbage out!

Diagnostic tools such as template charting provide an illustrative example. A patient tells a clinician: “I’ve been vomiting and my chest hurts.” If the clinician plumps for a template for vomiting, gastroenteritis or abdominal pain too quickly, they could easily lead themselves up the garden path, causing them to overlook the fact that what the patient really meant to say was: “I started having this really heavy chest pain and have been vomiting ever since.” If the template the GP first selected is now inaccurate, the patient may well be discharged with an undiagnosed MI.

Thinking problems can be at least partially avoided by simply being aware that they exist and, of course, there is no substitute for experience.

Our analysis of our cause of loss data is posing some challenging questions. We are in an era when considerable attention and resources are quite rightly being directed towards reducing and preventing system errors and failures. But system errors alone do not tell the whole story: the practice of medicine will always involve the use of judgement and critical thinking by individual doctors and clinicians when making a diagnosis.

As always, we would be delighted to hear about any similar experiences members have encountered, as well as tips and suggestions to guard against thinking errors.