Accidents will happen

Adam Campbell hears how personal tragedy led airline pilot Martin Bromiley to found a charity dedicated to reducing the incidence of human error in healthcare

“IT WAS the worst thing anyone would want to hear,” says Martin Bromiley, recounting the moment he was told his wife, Elaine, was in intensive care. Only a few hours earlier he had dropped her off at a private hospital for a routine sinus operation. An airline pilot, Martin says that after the initial shock he very quickly went into “pilot mode”.

“I thought the important thing now is that Elaine’s life is saved. My focus for three or four days was very much about being there and doing the best I could to make sure that whatever could be done was done,” says the 52-year-old.

It was 29 March 2005. Earlier that morning, at 8.35am, Elaine had been anaesthetised in preparation for the operation. Almost immediately things started to go wrong. Increased tone in her jaw muscles was preventing insertion of the laryngeal mask airway. Four minutes later her oxygen saturation had deteriorated to 40 per cent and attempts to ventilate her lungs continued to be unsuccessful. The consultant anaesthetist tried a tracheal intubation at around 8.43am, but this too failed.

By this time there were two anaesthetists, an ENT surgeon and at least three nurses in the room. Shocked at Elaine’s vital signs and colour, one nurse went out and booked an intensive care bed. Another asked her colleague to fetch a tracheostomy set. Both of these measures were considered over-reactions by the consultants as they continued to attempt intubation. The bed was cancelled, the tracheostomy set unused. When an intubating laryngeal mask was finally inserted at 9am, Elaine had already gone 20 minutes with severe oxygen starvation.

At 11am, Martin got the call to say Elaine had been admitted to intensive care at a nearby NHS hospital. On arrival he was told she might have significant brain damage. A few days later, once the reality of her situation became clear, the decision was made to switch off her life support.

What happened next – a journey that would lead to the setting up of the Clinical Human Factors Group (CHFG), a charity dedicated to reducing the incidence of human error in healthcare – was entirely unplanned but had much to do with Martin’s training as a pilot.

Not about blame

First there was his discovery that there was no plan to investigate the incident. The very idea was anathema to someone from the aviation industry. So he pressed the case, while making clear to the director of the private clinic: “this is about learning; it’s not about trying to blame anybody. My thought at the time was that the clinicians did absolutely everything they could and that there might be some small lessons that could be learned”.

The investigation and the subsequent inquest, however, highlighted numerous areas where things should have been done differently. Generally, there had been a loss of awareness of time and of the seriousness of the situation, and a breakdown in decision-making and communication among the consultants. The nurses said they knew what was supposed to happen, but they didn’t know how to broach the subject.

This is about learning; it’s not about trying to blame anybody

Clearly, thought Martin, there were some rather large lessons that needed to be learned. “I recognised that here were failings that had to do with human factors and non-technical skills,” he says – human factors being all the things that make people different from logical, predictable machines.

As a pilot, he was used to an industry where technical skills were rarely taught without an element of the non-technical. So when pilots are taught about a new piece of equipment, for example, there will also be a discussion of why they might choose not to use it in an emergency, and how colleagues can help to make sure it is used effectively. “I suddenly realised that here was a safety-critical environment which doesn’t seem to work in the way that other safety-critical environments work.”

Spreading the message

With two small children to look after on his own, Martin decided to cut his flying time by 50 per cent. This meant he had the odd afternoon here and there, and he kept coming back to this question of human factors in healthcare. So he began talking to people about it – academics, policymakers, clinicians, the National Patient Safety Agency – a telephone call here, an email there.

“I didn’t really have a plan but over two years I built up a picture of some really good work going on in health. But these were really tiny pockets of work, and they weren’t connected.”

He’d seen a human factors group involving policymakers, academics and pilots develop in aviation in the 1990s, starting almost as a hobby, and eventually become part of the Royal Aeronautical Society, so he decided to organise a meeting in London. Perhaps testament to his powers of persuasion, 45 people from his list of 80 names turned up – “all sorts of characters”, he remembers.

After that first meeting, it was suggested that if they were to keep it up they would need some capacity for booking meeting rooms and paying expenses, and that logically they should set themselves up as a charity. And so, with a £5,000 grant secured by Martin from the Health Foundation, the CHFG was born.

After discussions with his employer, he decided to continue flying at around half the time and dedicate the remainder, in an unpaid capacity, to spreading the clinical human factors message.

That was nine years ago, and in the intervening period the number of active supporters has grown to around 3,000 people across the UK. The CHFG runs free seminars and conferences, publishes guidance documents and illustrative health stories, and generally does its utmost to promote dialogue and the sharing of ideas across the healthcare spectrum. Its goal is to show that a better understanding of the role of human factors can have a significant impact on safety, quality and productivity.

In day-to-day terms, Martin explains, it’s about encouraging a system that, for example, makes it difficult to give someone the wrong drug – through better labelling and more standardised storage procedures – and more acceptable to double-check with colleagues that it’s the right drug, at the right dosage and by the correct route, even under pressure of time.

At bottom, he says, it’s about having multiple lines of defence that take into account that “no matter how good or intelligent or knowledgeable you are, you can still get it wrong”.

The NHS is a many-headed behemoth, of course, and the greatest challenge is altering policy. “A lot of my time is spent trying to persuade people at policy level about ways they can redesign the system to do it better. I’m not an expert, but what I can do is at least overcome some of the inertia and provide some motivation for people to go out and get that expertise.”

There has been change over the years. Whereas at the beginning he would find about one per cent of clinicians had heard about human factors, nowadays it’s a majority, even if they don’t understand what it is. “The teaching of it is much more widespread, but we’re still a long way from embedding it.”

Thank you for speaking up

Embedding a human factors approach on the personal level, says Martin, begins by clinicians asking themselves: what can I do with my behaviour that’s going to encourage people to be safe around me? One answer, he suggests, is to ask open questions. “You might walk into a situation and know exactly how to deal with it. But you should stop yourself and ask a more junior colleague, how do you think we should deal with this? It not only helps to develop them, but more importantly they might well see something you don’t.”

Another is to thank people for speaking up. “That encouragement is so important. I’ve had people saying to me when I’m flying, Martin, don’t forget such and such. Half the time I think, yes, well I was going to do that anyway. But I say thank you, because I know that next time I might genuinely have forgotten and be about to make a complete idiot of myself.

“It’s about humility because we are all so capable of screwing up. Safety in a complex world cannot be delivered by just one person, it has to be delivered by a team.”

Adam Campbell is a freelance journalist and regular contributor to MDDUS publications