Memorandum by the Clinical Human Factors
Group (CHFG) (PS 24)
1. THE CLINICAL HUMAN FACTORS GROUP (CHFG)
1.1 Since 2007 the Clinical Human Factors
Group (CHFG) has brought together leading experts in human factors
(HF), both academics and frontline clinicians, to identify and
promote best practice. It is an independent charitable trust (Registered
no 1123424). Its members believe that the lack of HF training
in medicine means that patient care is more hazardous than it
could and should be.
2.1 Many safety-critical industries have
reduced accidents and harm by knowledge of error theory and HF
principles. The inevitability of human error is accepted. Importantly
it is not viewed as poor performance, and so concealed, but openly
reported to drive improvement. Systems design and the appropriate
training of staff assure safety.
2.2 Training in HF skills such as teamwork
and communication is virtually absent in healthcare. It should
be mandated by regulation, taught and examined. The appropriate
professional bodies should be active partners in examining and
assessing competencies in non-technical skills (NTS) and HF for
both trainees and qualified staff.
2.3 Those who work together should train
together. Research has shown that teamwork training may reduce
technical errors by 30-50%.
2.4 There is a clear correlation between
HF skills and the frequency of error in operations. Minor errors
are frequently tolerated but they are significant as they accumulate
to cause major hazards. Minor errors must be recognized and reduced
by HF training.
2.5 Clinical staff must be involved in both
defining the problem and suggesting the solutions. Sustainable
improvements have been demonstrated by this approach. The prevailing
top-down culture of clinical governance does not work.
2.6 Blame invites denial among professionals
doing their best in poor systems. Central policy should ensure
a "fair blame" culture so that the reporting of error
is encouraged. Inadvertent error that is openly reported should
attract immunity from sanction.
2.7 We are concerned by the current model, whereby internal investigations are carried out by healthcare staff after only short training. This is not a robust approach to embedding incident investigation and results in a lack of independence. The current approach should be reviewed by an expert from another safety-critical industry.
3.1 This evidence relates to the following
terms of reference:
3.1.1 What the risks to patient safety are
and to what extent they are avoidable
3.1.2 The effectiveness of education in ensuring patient safety
3.1.3 What the NHS should do next regarding patient safety
3.2 The evidence has been produced by a
working group of the CHFG;
3.2.1 Martin Bromiley, Chairman. Current
airline pilot and widower of the late Elaine Bromiley
3.2.2 Professor Rhona Flin, Professor of
Applied Psychology, University of Aberdeen.
3.2.3 Tony Giddings, Fellow of the Royal
College of Surgeons of England & Chairman of the Alliance
for the Safety of Patients.
3.2.4 Peter McCulloch, Reader in Surgery
and Ken Catchpole, Senior Post-Doctoral Scientist, Nuffield Department
of Surgery, Quality, Reliability, Safety & Teamwork Unit (QRSTU),
University of Oxford.
3.2.5 Krishna Moorthy, Clinical Senior Lecturer,
and Nick Sevdalis, Non Clinical Lecturer in Patient Safety, both
of the Division of Surgery, Oncology, Reproductive Biology and
Anaesthetics, Imperial College, London.
3.3 In many industries well-publicized disasters have served as a turning point for safety; Piper Alpha, Three Mile Island, and Hillsborough Stadium are examples. In healthcare, by contrast, only one person dies at a time, and the death is rarely investigated independently or with the depth required. In 2005, however, one such event occurred and was properly investigated; it is now serving as a sentinel event.
4. HUMAN FACTORS: A REAL CASE
4.1 Elaine Bromiley was admitted for elective
surgery for an ongoing serious sinus problem. At induction of
anaesthesia airway problems occurred. She was transferred unconscious
to ICU but died 13 days later having never regained consciousness.
In a letter from the Surgeon it was explained to her husband,
"I still do not see how we could have anticipated or avoided
the problems we encountered". Mr Bromiley accepted this but
(as would have been normal in aviation) he expected it to be investigated,
not to blame but to learn. He persuaded the hospital to do this.
Professor Michael Harmer, then President of the Association of
Anaesthetists of Great Britain and Ireland conducted a thorough
review. From this and from the Inquest a clear picture has emerged
(Harmer 2005, Coroner's Report 2005 & Bromiley 2008).
4.2 Pre-operative assessment did not identify
any problems but when anesthetised Elaine could not be ventilated.
Within 4 minutes she had become visibly blue, with oxygenation
falling to 40% (less than 90% is critical). Despite others arriving
to help, at 10 minutes Elaine had suffered critically low oxygen
levels for 6 minutes and all attempts to intubate had failed.
This was situation of "can't intubate, can't ventilate",
a recognised emergency in anaesthesia. Surgical access is vital.
4.3 The team were in a well-equipped theatre. The principal anaesthetist had 16 years' experience and was regarded as "diligent" by his colleagues. The ENT Surgeon had over 30 years' experience; another anaesthetist who came to help had additional skills pertaining to difficult airways, and 3 of the 4 nurses present were skilled in theatre or recovery. If this emergency had to occur, this was arguably the ideal team.
4.4 Despite this we now know that the 3
consultants persisted with their attempts to intubate, apparently
to the exclusion of any other option and despite strong hints
from the nursing staff. Two of the nurses stated later that they
knew exactly what needed to be done "but didn't know how
to broach the subject". In his own words the lead anaesthetist
"lost control". There was a dispute at the Inquest about who was in charge. Among the consultants, there was
a collective loss of situational awareness, especially of the
passage of time and the seriousness of the situation.
4.5 Failings in leadership, decision-making,
prioritisation, situational awareness, communication and teamwork
were identified at the inquest as leading directly to Elaine's
death. These same "human factors" are the direct cause
of 75% of aviation accidents. Many safety-critical industries recognise these NTS and train for them, yet no member of this team, and virtually no clinician in the UK, receives any training in these vital skills.
4.6 This is not a problem of bad or negligent
people. As Martin Bromiley has said
"They are not bad people, they are not poor clinicians. They were good people doing a good job who had the technical skills to deal with what happened ... (but) ... not having the benefit of the training and development available in other industries, found themselves following a blind alley".
4.7 They lacked the knowledge and skills
of human factors.
5.1 The overall problem of human error/systems
failure in healthcare.
5.1.1 "... it is estimated that for one patient in every 300 entering hospitals in the developed world, medical error results in, or hastens death".
5.1.2 This arresting figure is from "Good doctors, safer patients", published by the CMO for England in July 2006. It is based on calculations by Dr Lucian Leape of the Harvard School of Public Health.
5.1.3 Adverse events are defined as unintended injuries caused by medical treatment rather than disease. Recent hospital studies consistently show that 8-12% of patients suffer an adverse event, half of which are preventable (Vincent, 2006). A review by de Vries et al. (2007) showed that most such hospital events occur during or as a result of an operative (40%) or medication (15%) error.
5.1.4 Thus 1 in 10 patients will be harmed
during an admission, over half due to operation or medication.
Surgery is associated with the greatest number of recorded errors;
obstetrics with the greatest financial cost (NHSLA). Acute care is therefore the focus of this evidence, not because there are no concerns in non-acute sectors but because data for those sectors are lacking. These findings are consistent worldwide.
5.2 Human Error, Poor Clinical Judgement &
5.2.1 There is strong circumstantial evidence
that clinicians have been trained to believe that error should
not occur rather than to recognize that all humans make mistakes
and need skills to manage and avoid them. In the NHS error is
still viewed as weakness and poor performance (Bromiley, 2008).
It is illustrated by major under-reporting of adverse events (NAO
2005). Such a culture fosters denial and is the hallmark of hierarchy
in the workplace. No major safety-critical industry would accept
this at a strategic or operational level. The Civil Aviation Authority
states "human error is inevitable ... what is important is to ensure that (this) does not result in adverse events such as ... accidents"; "specific training should concentrate particularly on 'error detection'" (CAA CAP737 2006).
5.2.2 Studies of surgical adverse events
reveal a large number of causal factors embedded within a highly
complex system (Vincent, Moorthy, Sarker, Chang, & Darzi,
2004). There is an interplay of organisational, cultural and team
factors that constantly threaten safety. These factors, often referred to as the "system", are critical to performance, but all systems are operated by people. Improvement therefore
depends on educating and empowering them. With help they will
know how to make their own work safer. The nuclear and aviation industries act as clear and successful examples. The same approach
is required in surgery (Giddings and Williamson 2007).
5.3 "Never" events & more common failures
5.3.1 Wrong site, wrong procedure, and wrong
person surgery are examples of avoidable and catastrophic events
which should "never" happen (Makary, 2006). Failures in pre-operative communication between surgeons and anaesthetists are common causes. The Joint Commission on Accreditation of Healthcare
Organisations found that 70% of wrong site events could have been
prevented by better communication. The incidence of retained foreign
body in surgery is 1 in 1000. It attracts considerable media attention
and censure in the surgical community; it is associated with a 2% risk of mortality and a re-operation rate of 69% (Gawande, 2003).
Failures in team communication are however only one aspect of
the systems failures in surgery. Routine surgical and anaesthetic
checks are not carried out, equipment problems are frequent and
adherence to basic procedures is variable (Healey, Sevdalis, &
Vincent, 2006). In the absence of pre-operative checks, crucial
equipment and prostheses are missing in many operating theatres.
5.3.2 The relatively rare and catastrophic
"never" events, however, represent only the tip of the
"iceberg of harm" caused by error and non-compliance
in surgical systems. Recognised preventable complications such
as deep venous thrombosis (DVT) and surgical site infection are
more likely where standard prophylactic measures are not carried
out. The degree to which other "inevitable complications" of surgery are also attributable to process or compliance failures is unknown but is likely to be significant. Reported frequency
varies by as much as 500% between Units and it is striking that
Units with the best outcomes are those with the best systems and
teamwork practices. As an example, DVT and pulmonary embolism
constitute 9% of adverse events (Gawande, 1999) but although guidelines
for DVT prophylaxis are widely available, adherence can be as
low as 30%.
6. THE IMPACT OF HUMAN FACTORS (TEAMWORK & COMMUNICATION)
6.1 A significant body of research on teamwork and communication in operating theatres confirms that communication breakdown is frequent and hazardous (Christian et al 2006).
Problems shared with the aviation industry have been highlighted,
particularly difficulties with cultural hierarchy, which inhibits
team members from sharing their situational awareness clearly
in critical situations.
6.2 Studies in paediatric cardiac surgery
at Great Ormond Street showed a clear correlation between the
quality of teamwork and the frequency of technical and procedural
errors in operations (Catchpole et al. 2007) and this has
been confirmed by the QRSTU group in Oxford (Catchpole, Mishra
et al. 2008). Not surprisingly, operations where there
are a large number of minor technical errors are more likely to
result in a serious major problem.
6.3 Despite the differences between surgery and civil aviation, there are striking similarities in HF issues. In Oxford, a detailed before-and-after training study showed that staff exposed to teamwork training based on aviation Crew Resource Management (CRM) made 30-50% fewer technical errors after training (McCulloch et al. 2008). The effect was variable,
but it is likely that changing team culture in the operating theatre
will reduce harm to patients.
6.4 There are however major differences
between the professional cultures of surgery and aviation. Resistance
to teamwork training in the Oxford study often came from well-respected,
highly professional doctors. To accept that error is inevitable and that a systems approach is needed to prevent patient harm, or that other team members are important in protecting patients from one's own errors, may be seen as hurtful. Perhaps for this reason, teamwork
training proved unsustainable in Oxford, once the stimulus of
the study was removed. Many NHS professionals lack the culture
to embrace such change.
6.5 Sustainability may be assisted by "Lean
thinking" to improve safety practices on surgical wards.
This method, which stresses involvement of front line staff both
in defining the problem and producing the solution, has the potential
to effect culture change in healthcare. For example Kreckler et
al. (2008) have demonstrated sustainable improvements from
35% to 94% in compliance with thrombosis prevention.
7.1 Every day, every hour, every minute there are uncontrolled events in healthcare which would not be permitted in any other high-risk industry. We need to ask why.
It is partly because healthcare is uniquely complex but it is
also because what could have been done to improve the training
of staff and the systems in which they work has not yet been done.
This is despite clear identification of the need for at least
eight years, since the publication of "An Organization with
a Memory" and the NHS Plan, to which there was a clear professional
response identifying "the central importance of improved
investment in surgical education and training" (FSSA Response).
7.2 The case of Bethany Bowen illustrates
the continuing problems.
7.2.1 Bethany was a 5 year old girl who
suffered from hereditary spherocytosis, a condition sometimes
requiring removal of the spleen. Bethany died on the operating
table due to uncontrolled bleeding. At the operation the surgeon
and the assistant were using a morcellator, an instrument with
rotating blades, to fragment the spleen for easier removal. This
is an instrument more commonly used in gynaecology and is known
to be capable of serious internal injury. It appears that neither the surgeon nor the assistant had previous experience with this instrument.
7.2.2 The death of this young child illustrates
two important and avoidable causes of failure in healthcare.
7.2.2.1 A series of failures on the part
of the surgeon; to display insight, to anticipate hazards and
to undertake the necessary training. This was not only a failure
as a surgeon but as a trainer and role model for a vulnerable
and inexperienced trainee and as the leader of a surgical team.
It raises serious issues about how surgeons see themselves and their responsibilities for their patients, their trainees, and their teams.
7.2.2.2 The case also illustrates a complete
failure of the system of clinical governance in this hospital.
The hospital had already been the subject of a recent Healthcare
Commission report into another surgical service. In both cases
HF issues may well have been at the root of the problems identified.
In theory such failures would be prevented by the operation of
effective clinical governance. Unfortunately it is generally the
case that the bureaucratic processes of clinical governance imposed
from above operate, as it were, in a parallel universe. They have
little effect on the behaviour of semi-autonomous professionals
in the front line.
7.3 This is not to suggest that professional
dysfunction is widespread but where it exists it may be persistent
and ignored. Neither is it to suggest that professional autonomy is inappropriate; indeed a degree of autonomy is essential to the delivery of appropriate, patient-centred treatment. But that autonomy must be exercised within the boundaries of an overt and effective system of governance, and balanced by clear accountability.
7.4 Effective education must be supported
by regulation, an appropriate curriculum, time, and money. It
also requires a workforce of trainers who have been selected and
trained to teach. It must also be quality-assured, assessed and
examined. In the generic and critically important field of HF
none of this has taken place.
7.5 The NHS's heavy reliance on the service contribution of trainees has produced an excess of trainees relative to trainers, the reverse of the ratio found in every other developed country, where trained doctors deliver the majority of care.
8. WHAT THE
NHS SHOULD DO
8.1 The task is to improve individual and
team performance and the systems in which they work. But without
ownership and collaboration, workarounds, defensive routines,
bullying and negative dictatorship will continue.
8.2 Even sensible ideas such as the four-hour wait target for A&E may cause harm if applied in a mechanical
way by diversion of patients or the bullying of staff to comply,
while denying them the means to do so. Examples are common and
details known to us.
8.3 Staff must be given the skills to recognise
problems and the cultural freedom to express them. They must also
be encouraged and expected to provide the solutions and to manage
the process; clearly they will depend on the resources to monitor
and adapt as they progress. Solutions must be built into systems
that deliver evidence-based, best practice and protect patients
and staff from hazard. The focus should be on the process if we
are to achieve sustainable change.
8.4 The need to develop team-working skills will prove too challenging for some senior clinicians once they are denied the cultural barriers behind which they have obscured their deficiencies as leaders of their team or as members of the wider team. Others must then replace them.
8.5 The NHS has benefited from huge recent
investment but little of that has been applied to the overwhelming
and generic need to change the systems in the workplace and the
culture of professionals.
8.6 If the safety of patients is to be improved,
a sustained and overt commitment to training in human factors
and systems improvement is unavoidable. This is a moral imperative
not a strategic option.
8.7 Specific recommendations are in our