Human Error and Patient Safety
     Keywords: human error; patient safety

    UNDERSTANDING OURSELVES IN THE HEALTHCARE SYSTEM: PSYCHOLOGICAL INSIGHTS

    Many people working in health care know very little about the human and organisational precursors of error. But as technology advances and both workloads and complexity in health care increase, the risk of error and of adverse patient outcomes grows. In the face of these trends, public expectations of health care are rising and tolerance of error is diminishing. The paper by Professor James Reason, although focused on anaesthetic mishaps, contains generic information that should now be considered a required part of the undergraduate and postgraduate medical curricula.

    Learning from others

    Health care has been characterised by "silo" thinking. All around it, other professions and occupations—aviation, nuclear power plant operation, military command, fire prevention, rescue organisations—have developed and employed successful safety measures that are directly applicable to many healthcare activities. Until recently, medical workers took little notice, but we are now learning, largely through the insights of psychology. For example, some disciplines—led by dentistry,1 nursing,2 and anaesthesia3—have already made effective use of the powerful technique of incident reporting and analysis.4

    From such data comes a dawning realisation of the significant role played in healthcare error production by "latent errors" and "system failures", rather than simply by an error of a person at "the sharp end".5,6 As Reason has pointed out, blaming such an individual implies delinquency, which is usually dealt with by measures carrying a "shaming" flavour. This has no remedial value at the level of the individual (it is often actually counterproductive), for the individual "... did not choose to err in the first place". (It may be observed that such understanding still appears to lag within some legal circles, despite centuries of their own evidence.)

    Signs of a culture change

    Hand in hand with these insights goes a gradual culture change in health care, which is learning to admit that human error is inevitable, universal, and "... a completely normal and necessary part of human cognitive function".7 Incident, near miss, and adverse event data are now being meaningfully gathered. This is leading to the development and application of effective preventive strategies—strategies derived from real world data—particularly against such "system" or "organisational" failures. The result is real improvement in patient and staff safety.8–10 Such appropriate use of incident data has been referred to as "closing the loop".

    The need for humility

    The earlier absence from medical and nursing curricula of patient safety as a discrete topic, and of any instruction in the psychology of human error, bred generations of medical and nursing personnel lacking insight into such matters. This "latent knowledge-based mistake" has since combined with several "sharp end" triggers: vast technological advances, large increases in the healthcare workload, greatly increased public expectations and, sadly, a previously perpetuated (albeit ridiculous) culture of "doctor infallibility" in medicine. Together these factors have contributed to the awful iatrogenic injury (and "near miss") rates with which we are presently struggling.11

    Happily, such insights are bringing with them a long overdue reduction in medical arrogance, an increase in humility and, perhaps most importantly, recognition of the need for early, full, and open communication with affected patients, their families, and their carers following any healthcare error. (This last trend, hardly surprisingly, carries profound potential for diminishing patient and family dissatisfaction, and their tendency to sue, when an error occurs.)

    Of course these medical self-criticisms (mea culpa) may exhibit some degree of "hindsight bias", for one is now aware of the "outcome"—the unacceptably high rates of iatrogenic harm—despite most medical and nursing folk having generally tried their hardest over the years to do the right thing. Back then we were "armed only with foresight". Many healthcare workers (and patients and their relatives) knew things were not right, but they did not know what to do about it other than to apportion individual blame. Again, in Reason’s words, "... It usually takes someone else, with a fresh view of the situation, to detect a deviation from some adequate path."

    A "classic" paper

    This paper condenses, in a nutshell, what every healthcare worker should understand about errors and their prevention. That must include the people in the higher echelons of the healthcare system: politicians, administrators, planners, lawyers, and senior consultants. The paper carries generic truths that will be as valid in 100 years’ time as they were in 1995 and are today, because its focus is on how humans basically think and behave. It will indeed be a long time before there are any changes in these attributes!

    REFERENCES

    1. O’Donnell RJ. The development and evaluation of a test for predicting dental student performance. Univ Pittsburgh Bull 1953;49:240–3.

    2. Safren MA, Chapanis A. A critical incident study of hospital medication errors: Parts 1 and 2. Hospitals 1960;34:32–453.

    3. Cooper JB, Newbower RS, Long CD, et al. Preventable anesthesia mishaps: a study of human factors. Anesthesiology 1978;49:399–406.

    4. Flanagan JC. The critical incident technique. Psychol Bull 1954;51:327–58.

    5. Runciman WB, Webb RK, Lee R, et al. System failure: an analysis of 2000 incident reports. Anaesth Intens Care 1993;21:684–95.

    6. Reason J. Safety in the operating theatre – Part 2: Human error and organisational failure. Current Anaesth Crit Care 1995;6:121–6.

    7. Allnutt MF. Human factors in accidents. Br J Anaesth 1987;59:856–64.

    8. Runciman WB, Sellen A, Webb RK, et al. Errors, incidents and accidents in anaesthetic practice. Anaesth Intens Care 1993;21:506–19.

    9. Australian Patient Safety Foundation. Crisis management manual: cover ABCD – a swift check. Adelaide: Australian Patient Safety Foundation, 1996.

    10. Pirone CJ, Goble S, Bullock M. The incidence of ear barotrauma in hyperbaric medicine: analysis of 191 reports from the HIMS data. HTNA, Program and Abstracts of the 9th Annual Scientific Meeting on Diving and Hyperbaric Medicine 2001:26.

    11. Runciman WB, Moller J. Iatrogenic injury in Australia. Adelaide: Australian Patient Safety Foundation, 2001.

    J Williamson and P Barach