Medical errors: necessary fallibility!

Date: 27 November 2014

In my last blog I highlighted the case of a missed lung cancer diagnosis where several GPs in the same practice failed to appreciate the potential seriousness of the presenting symptoms, leading to a delayed diagnosis, and ultimately a hastened death.

At the heart of these cases we have been looking at the potential contribution of so-called ‘thinking errors’, which can arise from cognitive biases during consultations.

I was therefore interested to note the findings of a recent Scottish Public Services Ombudsman report about a patient’s death from bowel cancer, which highlighted several areas of concern. It noted an apparent willingness ‘in general’ by doctors at the practice in question to consistently adopt an approach which assumed that a ‘low risk’ explanation was likely, rather than considering the opposite possibility.

In this particular case a GP recorded early on that the patient’s symptoms were likely attributable to a more common and less serious medical problem. However, the Ombudsman’s expert opinion was very clear that they could equally have represented a more sinister diagnosis.

Now hindsight is of course a wonderful thing, as was borne out in this case. However, the Ombudsman also quite reasonably pointed out that practitioners need to be prepared to shift how they weigh up the balance of risk. Only by changing this balance of risk assessment could more cancers be diagnosed early.

A suggested approach could involve an initial and serious consideration of a more sinister diagnosis, followed by full investigation and, where indicated, early referral for specialist investigations. The Ombudsman’s expert was of the opinion that achieving this consistently required a change of attitude, rather than necessarily a change in the practice’s risk assessment processes.

So why, in this case as in many others we deal with at MDDUS, might a doctor fail to work out what was going on and act on it? In previous blogs we have looked at theories postulated by psychologists which may explain ‘thinking errors’ in diagnosis. But can we look elsewhere for other approaches or explanations?

The philosophers Samuel Gorovitz and Alasdair MacIntyre wrote in 1976 on the nature of human fallibility. They hypothesised that there are two primary reasons why, as human beings, we might fail. The first is ignorance: we have only a limited understanding of all the relevant physical laws and conditions that apply to any given problem or circumstance. The second they identified as ‘ineptitude’: the knowledge exists, but an individual or a group of individuals fails to apply it correctly.

Assuming that sufficient safeguards exist in medicine to mitigate, though not eliminate, the first of these, the vast majority of claims we manage at MDDUS probably relate to the second phenomenon.

So if the science or knowledge is concerned with universalities – universal truths, laws of how the body or the world behaves – then the application is concerned with particularities, and the test is how the universalities apply to the particularities.

And here Gorovitz and MacIntyre suggested a third possible type of failure beyond ignorance and ineptitude. They called this ‘necessary fallibility’: some kinds of knowledge that science can never deliver.

To illustrate this they gave the example of predicting how a hurricane will behave: where and when it will make landfall, and how fast it will be going when it does. Here, meteorologists are asking science to do more than it can in trying to predict what’s going to happen. Hurricanes follow predictable laws of behaviour, but no hurricane is like any other hurricane. Each is unique, just like each human being.

Dr Atul Gawande, in his 2014 Reith Lecture ‘Why do doctors fail?’, states: “We have 13 different organ systems and at the latest count we’ve identified more than 60,000 ways that they can go awry. The body is scarily intricate, unfathomable, hard to read. We are these hidden beings inside this fleshy sack of skin and we’ve spent thousands of years trying to understand what’s been going on inside. So the story of medicine to me is the story of how we deal with the incompleteness of our knowledge and the fallibility of our skills.”

Gorovitz and MacIntyre suggest that we therefore cannot have perfect knowledge of, for example, a hurricane (or the human body) short of having a complete understanding of all the laws that describe natural processes, and a complete state description of the world. It would require, in other words, omniscience, and we can’t have that.

Gawande meanwhile asks an intriguing question: how do we cope? Now it’s not that it’s impossible to predict anything. Some things are completely predictable, and Gorovitz and MacIntyre cite the example of a random ice cube in a fire: “An ice cube is so simple and so like all the other ice cubes that you can have complete assurance that if you put it in the fire, it will melt.” The everyday question for us, however, is this: are human beings more like hurricanes, or are we more like ice cubes?

He suggests that in more and more ways there are encouraging signs that we are as knowable as ice cubes. “We understand with great precision how mothers can die in childbirth, how certain tumours behave, how the Ebola virus spreads, how the heart can go wrong and be fixed. We have many, many areas of continuing ignorance – Alzheimer’s disease and what we can do about it, metastatic cancers, how we might make a vaccine against this virus we’re dealing with now.”

Gawande states that it may be uncomfortable for doctors to look inside their fallibility, that they have a fear of looking. He suggests that the place we’ve come to is similar to that of the early doctors who dug up bodies in the 19th century and dissected them in order to know what was going on inside. When we look inside our bodies now we look further, right inside our systems and how they really work and interact. He says they’re messier than we knew, and sometimes messier than we might have wanted to know.

So, we don’t and won’t get it right all the time. Is there a fear that admitting this will make patients angrier than simply accepting that we are ignorant about certain areas of medicine?

As always, thoughts and opinions are sought on this more philosophical approach to medical errors.

http://www.nice.org.uk/news/press-and-media/nice-updating-guidance-for-faster-cancer-diagnoses

 

