Four rules for the reinvention of health care
     Enrico Coiera, professor1

     1 Centre for Health Informatics, University of New South Wales, Sydney, NSW 2055, Australia e.coiera@unsw.edu.au

    If health care is to evolve at a pace that will meet the needs of society it will need to embrace this science of sociotechnical design, but ultimately it is our culture's beliefs and values that shape what we will create and what we dream

     Futurists might like to speculate on what the health services of 2020 will look like. The world may be such that, as a clinician, you work in flexible virtual teams and some of your colleagues are computers. You would of course instinctively mistrust clinicians who always know the answer without consulting the information grid, and patients often choose to be the team leader. Keyboards are banned as harmful and can be found in museums, next to punch cards and spittoons. The health record is a direct multimedia history of conversations, and a software agent is its curator. For the still cognitively limited clinician, your earring whispers your patient's name when you meet.

     More importantly, in 2020 the health system in most nations will have to treat proportionately more people, with more illness, using relatively fewer tax dollars and workers.1 Given that commentators today are alarmed at the current strains on the health system, we have to assume that by 2020 the healthcare systems in most nations will either have somehow transformed substantially or will have failed. If health care is to flourish in the coming setting of diminished resources and increased demand, then it will do so because we have explicitly designed and implemented new systems of care that are fundamentally sustainable. Given the likely enormity of that task, it may require nothing less than the reinvention of health care.

     Many of the innovations needed for this reinvention are still unimagined today, but we can predict some of what must come to pass. In 2020 clinicians will care more effectively for more patients than today, because some of the burden of care will have shifted away from individual clinicians. Some of that burden will rest with the consumer, who participates actively in maintaining good health and managing ill health. Some of the burden must also shift to machines, because without computational automation much of what needs to be done to make the system run will otherwise remain undone. Most importantly, our services and systems will need to be "designed" to meet our needs, in contrast to the inherited and patched up system we have at present. As with other industries, by 2020 our designed processes will need to be certifiably safe and efficient. Prevention needs to be designed into the health system's core, eliminating many of the determinants of ill health that generate the current demand for services. By 2020 the current situation, in which healthcare delivery actually contributes to morbidity and mortality through avoidable error, should be seen as a wretched historical anomaly.

     The roles of existing healthcare professionals are also bound to change. Biomedical expertise, for example, will no longer be seen to reside in the heads of experts but will rather reside in the system. Knowing "about" is replaced by knowing "how to find out," and clinicians and machines are always "connected" to each other via the information grid to share knowledge and decisions and to form "just in time" teams to deal with specific problems or patients. Since health care is so complex and expensive, new roles are needed, including health service brokers who help consumers navigate the health system and identify where the best care can be found. Evidence interpreters will help consumers find the evidence they need to make informed choices and help them understand the meaning of that evidence.

     This journey to reinvent health care begins by recognising that to design health services, we need to understand systems. The behaviour of a system emerges out of the interaction of its components, and the more components there are, the harder it is to predict the outcome of a seemingly simple change. The nature of complex systems such as health care means that simple fixes will always have unexpected consequences. The web of interactions needed to make anything work in a complex organisation always entails humans solving problems with limited resources and working around imperfect processes. Designing technological tools in isolation from the way they will affect the organisation optimises solutions only for specific local tasks and ignores global realities. The biggest information repository in most organisations sits in the heads of the people who work there, and the largest communication network is the web of conversations that binds them. Together, people, tools, and conversations form the "system."2
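
     The point about prediction can be made concrete with a little arithmetic. The short Python sketch below is purely illustrative and assumes nothing about any particular health service; it simply counts how quickly the pathways of interaction that a designer would have to reason about multiply as components (people, tools, conversations) are added.

# Illustrative arithmetic only: how the number of possible interactions grows
# as components are added to a system. "Components" here is an abstraction;
# it is not a model of any specific health service.
from math import comb

for n in (5, 10, 20, 50):                 # number of interacting components
    pairs = comb(n, 2)                    # two-way interactions: n(n-1)/2
    groupings = 2 ** n - n - 1            # every grouping of two or more components
    print(f"{n:>3} components: {pairs:>5} pairwise interactions, "
          f"{groupings:,} possible groupings of two or more")

     Even at 50 components there are more than a thousand pairwise interactions and over a quadrillion possible groupings, which is why the consequences of a seemingly simple change are so hard to foresee.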

    Consequently, this science of health service design must be a science of sociotechnical systems,3 and today that science is called informatics.4 This call to design sociotechnical systems is as much a challenge to health care as it is to informatics, which still has a bias to technology driven innovation. Although the sociotechnical viewpoint has been around for about 50 years, technology is still king. The sacred ground of health informatics remains anything to do with the computer, the web, information architectures, the electronic health record, and heroic challenges such as the creation of enormous terminology systems. The profane ground of health informatics, still mostly shunned, is the world of politics, culture and persuasion, complaints from users when systems disappoint them, the messy craft of system implementation, which requires different tactics from one site to the next, and our unacceptably high number of system failures.5

    I propose four rules for the new sociotechnical informatics, which could help guide the active design of our health services.

    Rule 1: Technical systems have social consequences

    Introducing a technology into a setting affects not only the users it is specifically intended for but also the people surrounding them. For example, a doctor's use of a desktop computer in the consulting room can result in shortened and delayed responses to patients, reduced eye contact with patients, failure to hear patients' comments, and patients trying to judge when to talk to the doctor on the basis of his or her interactions with the computer.6 7

    The introduction of a computer based data gathering system for injuries in Kenya required a researcher on a motorbike to go to the injury site and document it by using a global positioning device.8 Unexpectedly, the elders in the villages now report that the number of assaults against women is down because the perpetrators know that every time they injure someone, an "official" on a motorbike comes and puts their name into a computer (W Odero, personal communication, 2003).

    Rule 2: Social systems have technical consequences

     The utility of technology is socially shaped. For example, the strongest predictor of email uptake in an organisation is not the software's intrinsic utility for communication tasks but whether your manager uses email.9 Similarly, online evidence systems are now increasingly in use, but uptake still varies widely, even between apparently similar organisations. In one study the only variables that seemed to explain why some hospitals adopted online evidence systems and others did not related to the local culture of the organisations: local champions, teams with a climate that supported innovation, and a culture supporting evidence based practice were evident in the organisations that adopted the technologies most readily.10 We can start to understand this class of phenomena by noting that people tend to treat computers and communications media as if they too were people.11 In other words, humans relate to the world with social rules and values and use these same rules to judge and interact with technologies.

    Rule 3: We don't design technology, we design sociotechnical systems

     If the social and the technical are inseparable, then the design of systems needs to change. We should no longer accept designs that are restricted to technological systems alone but should broaden the scope of design to include social structures.12 Any new health service might (and probably must) entail innovation in clinical roles, work processes, and culture as well as the new technologies drawn from the treasure chests of ehealth and bioinformatics.

     Consider, for example, the designer of an electronic health record, who usually focuses on sculpting the interaction between a single clinical user and the record. However, other human agents also populate the interaction space. The user of the electronic medical record is often not the sole author of the content that is captured in the record but is recording the result of a set of discussions with other clinical colleagues. If the goal of designing an electronic medical record is to ensure that the highest quality data are entered into the information system, it may be even more important to support the collaborative discussion between clinicians than to engineer the act of record transcription into the system. Failing to model the wider interaction space for the electronic health record means that we may overengineer some interactions with diminishing returns, when we could be supporting other interactions that may deliver substantial additional benefit to our original design goals.2
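
     One way to picture this wider interaction space is sketched below. This is a hypothetical toy structure, not any real electronic health record schema, and every name in it is invented for illustration: the point is simply that a record entry can capture the contributors and the discussion that produced it, rather than only the text typed by a single author.

# Hypothetical sketch only: a record entry modelled as the product of a clinical
# conversation, so the collaborating clinicians and their discussion are captured
# alongside the transcribed content. Field names are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ConversationTurn:
    speaker: str          # e.g. "registrar", "consultant", "pharmacist"
    utterance: str        # what was said, messaged, or dictated
    timestamp: datetime

@dataclass
class RecordEntry:
    patient_id: str
    author: str                         # the clinician who typed the entry
    content: str                        # the transcribed clinical note
    contributors: List[str] = field(default_factory=list)      # everyone in the discussion
    discussion: List[ConversationTurn] = field(default_factory=list)

# Usage: the entry credits the whole team, not just the person at the keyboard.
entry = RecordEntry(
    patient_id="MRN-0001",
    author="registrar",
    content="Plan: continue intravenous antibiotics; review renal function tomorrow.",
    contributors=["registrar", "consultant", "pharmacist"],
    discussion=[ConversationTurn("consultant",
                                 "Happy to continue the current regimen.",
                                 datetime(2004, 5, 1, 9, 30))],
)
print(f"Entry by {entry.author}, informed by {len(entry.contributors)} contributors")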

    Rule 4: To design sociotechnical systems, we must understand how people and technologies interact

     A sizeable gap exists in the science of informatics relating to our understanding of health systems. Our capacity to model systems and predict the impact of new technologies within existing social systems is also primitive. Before we can rely on modelling and simulation methods, much as engineers use computer aided design (CAD) to design physical objects, we will need the raw data that describe the attributes of clinicians, clinical work, and the way that clinicians perform in a variety of environments. For example, it is now becoming clear that the work environment may impose unacceptable loads on human cognitive abilities and potentially lead to memory overload and error.13 An environment in which busy clinicians are constantly interrupted by colleagues or by synchronous technologies such as the telephone, pager, and email is, in effect, "designed" to produce error and inefficiency. Designers of busy clinical services thus need to factor in human cognitive limits and the workloads generated by other services over which they have no control.
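
     To illustrate what such modelling might look like, the sketch below is a deliberately simplified, hypothetical simulation of a clinician working through a list of tasks while being interrupted at random. The interruption rate, resumption delay, and per-interruption slip probability are invented placeholders; they stand for exactly the kind of raw data about clinicians and clinical work that would have to be measured before models like this could be trusted.

# Hypothetical sketch, not a validated clinical model: a clinician works through a
# fixed list of tasks; interruptions arrive at random, each adding a resumption delay
# and a small chance of a "slip" (a forgotten or erroneous step). All parameter
# values are assumptions chosen for illustration only.
import random

def simulate_shift(n_tasks=30, task_minutes=10.0, interruptions_per_hour=6.0,
                   resumption_delay=2.0, slip_probability=0.03, rng=None):
    rng = rng or random.Random(0)
    rate = interruptions_per_hour / 60.0          # interruptions per minute of work
    total_time, slips = 0.0, 0
    for _ in range(n_tasks):
        work_done = 0.0
        while True:
            gap = rng.expovariate(rate) if rate > 0 else float("inf")
            if work_done + gap >= task_minutes:   # task finishes before the next interruption
                total_time += task_minutes - work_done
                break
            work_done += gap                      # progress made before the interruption
            total_time += gap + resumption_delay  # time lost re-establishing context
            if rng.random() < slip_probability:
                slips += 1                        # a forgotten or erroneous step
    return total_time, slips

for per_hour in (0.0, 4.0, 12.0):
    minutes, slips = simulate_shift(interruptions_per_hour=per_hour, rng=random.Random(1))
    print(f"{per_hour:4.0f} interruptions/hour -> {minutes:5.0f} minutes of work, {slips} slips")

     Even a toy model like this shows time lost and slips rising together as the interruption rate climbs, which is the design point: the workload imposed by interruptions is a property of the whole sociotechnical system, not of any single tool.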

    If health care is to evolve at a pace that will meet the needs of society it will need to embrace this science of sociotechnical design. Perhaps we begin the journey by designing a sustainable and flexible culture that does not fear innovation and sees the redesign of roles, processes, organisations, and careers as the first amongst all of its duties. Whether we are enraptured by the promise of technology or fear it, the way we choose to head will not be shaped by technology but by our will. It is our culture's beliefs and values that shape what we will create and what we dream.14

    Summary points

    Over the next 20 years, national health systems will have to treat proportionately more people, with more illness, using relatively fewer tax dollars and workers, yet these systems are already under significant strain

    To flourish in the coming setting of diminished resources and increased demand, we must design new systems of care that are fundamentally sustainable, and this may require nothing less than the reinvention of health care

    This journey to reinvent health care begins by recognising that to design health services, we need to understand both the behaviour of complex systems and the science of system design, which is increasingly associated with the discipline of health informatics

    Since health systems are sociotechnical systems, where outcomes emerge from the interaction of people and technologies, we cannot design organisational or technical systems independently of each other

    Contributors: EC is the sole author.

    Funding: None

    Competing interests: None declared.

    References

     1. Costello P. Intergenerational report, 2002-3 budget paper No. 5. Commonwealth of Australia, 2002. www.budget.gov.au/2002-03/bp5/html/index.html

     2. Coiera E. Interaction design theory. Int J Med Informatics 2003;69: 205-22.

     3. Trist EL. The evolution of sociotechnical systems as a conceptual framework and as an action research program. In: Van de Ven AH, Joyce WF, eds. Perspectives on organization design and behavior. New York: John Wiley, Wiley-Interscience, 1981: 19-75.

     4. Coiera E. Guide to health informatics. London: Hodder Arnold, 2003.

     5. Kaplan B, Brennan P, Dowling A, Friedman C, Peel V. Toward an informatics research agenda: key people and organizational issues. J Am Med Inform Assoc 2001;8: 235-41.

     6. Greatbatch D, Luff P, Heath C, Campion P. Interpersonal communication and human-computer interaction: an examination of the use of computers in medical consultations. Interacting with Computers 1993;5: 193-216.

     7. Booth N, Robinson P, Kohannejad J. Identifying successful communication skills in computer use in the consultation—the information in the consulting room project (iiCR). PHCSG Annual Conference Proceedings, Cambridge, United Kingdom, 2001: 90.

     8. Rotich J, Hannan T, Smith F, Bii J, Odero W, Vu N, et al. Installing and implementing a computer-based patient record system in sub-Saharan Africa: the Mosoriot medical record system. J Am Med Inform Assoc 2003;10: 295-303.

     9. Markus ML. Electronic mail as the medium of managerial choice. Organisation Sci 1994;5: 502-27.

     10. Gosling AS, Westbrook JI, Coiera EW. Variation in the use of online clinical evidence: a qualitative analysis. Int J Med Informatics 2003;69: 1-16.

     11. Reeves B, Nass C. The media equation. Cambridge: Cambridge University Press, 1996.

     12. Bostrom RP, Heinen JS. MIS problems and failures: a socio-technical perspective. Part II: The application of socio-technical theory. MIS Q 1977;December: 11-28.

     13. Parker J, Coiera E. Improving clinical communication: a view from psychology. J Am Med Inform Assoc 2000;7: 453-61.

     14. Coiera E. The impact of culture on technology. Med J Aust 1999;171: 508-9.