Building on Experience — The Development of Clinical Reasoning
Geoffrey Norman, Ph.D.
     As medical students become physicians, they need to learn to diagnose and manage clinical problems — a process often referred to as developing clinical reasoning skills. Researchers have been exploring the nature of clinical diagnostic reasoning for more than three decades. The initial interest was sparked by a new generation of medical schools, such as those at McMaster University and Michigan State University, whose curricula were explicitly directed toward teaching and learning about "clinical problem-solving." Little was known about the process, but the belief was that if it were better understood, we could teach it more effectively. In this issue of the Journal, Bowen points out the irony that although early research on clinical reasoning was predicated on a desire to improve teaching, practical pedagogic implications were rarely considered.1 This criticism still holds: although the field has undergone several shifts in focus, we are only now in a position to provide any sound, evidence-based pedagogic advice.

    Early research was based on the assumption that expertise resided in the acquisition of general strategies or heuristics — clinical problem-solving skills — possessed by experts and striven for by students. Alas, it was not so.2 Success in solving a particular clinical problem was soon shown to be a poor predictor of success in solving the next one. Elstein et al. labeled the phenomenon "content specificity," a term that implied that success in problem solving was strongly related to having the right kind of content knowledge.3

    Researchers responded by searching for the kinds of knowledge that distinguished experts from novices. In some respects, such a pursuit seems self-evidently worthwhile — as in the two characteristic examples presented by Bowen. More experienced learners do describe cases in ways that are different from those used by less experienced learners, and surely the differences must reflect the knowledge that accompanies their expertise.

    Or do they? The answer becomes more complex — and complicated — as Bowen attempts to describe all the ways that expertise influences verbal descriptions of cases given by clinicians at varying stages of their training. As she points out, various researchers have identified clinical expertise with problem representations, illness scripts, semantic qualifiers, pattern recognition, and patient prototypes. I could add a few other representations: mental matrices based on Bayesian probabilities, reasoning schemes based on decision trees, and causal reasoning in the form of multiple "if–then" rules. Perhaps Bowen's most insightful comment is that "clinicians often unconsciously use multiple, combined strategies to solve clinical problems, suggesting a high degree of mental flexibility and adaptability in clinical reasoning."

    Indeed they do. If we ask experts to create reasoning schemes for us, describe a case using semantic qualifiers, or estimate probabilities, they are more than willing to oblige. But this should hardly be surprising. We all possess all kinds of knowledge of all kinds of things. An expert clinician has access to multiple knowledge representations about many diseases, ranging from pathophysiological descriptions to the appearance of the last patient he or she saw with a particular condition.4

    Furthermore, diagnostic success may be a result of processes that can never be described by the clinician. If the right diagnosis arises from pattern recognition, clinicians are unlikely to be able to tell you why they thought the patient had gout, any more than we can say how we recognize that the person on the street corner is our son.5 Bowen claims that "strong diagnosticians can generally readily expand on their thinking"; I believe, instead, that strong diagnosticians can tell a credible story about how they might have been thinking, but no one, themselves included, can really be sure that it is an accurate depiction.

    What, then, can we fairly say about the complexity of the clinical reasoning process? First, expertise is not a matter of acquiring some kind of general, all-inclusive reasoning strategy. As a result, trying to teach or evaluate "clinical problem-solving" or "clinical reasoning skills" is quixotic. Knowledge counts. But, second, no one kind of knowledge counts more than any other. We should encourage students to describe cases concisely and to use medical terms to show that they understand how the patient's words translate into accepted medical equivalents and how they are linking the case to their formal knowledge. But we should not be so puritanical as to assume that one kind of formal knowledge — be it reasoning scheme, illness script, or semantic qualifier — can claim supremacy. Several paths may lead to the same destination.

    Third, expertise in medicine, as in any craft, derives from both formal and experiential knowledge. The process of pattern recognition, so characteristic of an expert's approach, is a product of extensive experience with patients overlaid on a formal knowledge structure. It takes both kinds of knowledge to achieve success, and both are used by experts. For this reason, clinical teachers should abandon the mythical ideal of the clinician as an objective, impassive observer and instead should encourage learners at all levels to use their experience to guide their search. Explicitly encouraging students to use both analytical rule knowledge and experiential knowledge has been shown to be an effective pedagogic strategy.6 Put simply, there is no substitute for experience, even the limited experience of novice clinicians.

    I think we have tended to discount the experiential component of clinical expertise, dismissing it as mere pattern recognition and disparaging experts who are guided by experience instead of the latest evidence-based systematic review. Our current understanding of medical expertise suggests that this bias is misguided; a critical element of becoming an expert is accruing the vast experience that enables experts to recognize patterns effortlessly most of the time — and to recognize, as well, when the signs and symptoms do not fit a pattern at all. If we do little more than legitimize experiential knowledge and encourage teachers to emphasize and explain its importance to their students, instead of insisting that students gather endless lists of signs and symptoms in a mindless "complete history and physical," this seemingly small step forward will be an important accomplishment in medical education.

    No potential conflict of interest relevant to this article was reported.

    Source Information

    Dr. Norman is a professor of clinical epidemiology, biostatistics, and psychology and the assistant dean of the Programme for Educational Research and Development at McMaster University, Hamilton, ON, Canada.

    References

    1. Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med 2006;355:2217-2225.

    2. Norman G. Research in clinical reasoning: past history and current trends. Med Educ 2005;39:418-427.

    3. Elstein AS, Shulman LS, Sprafka SA. Medical problem solving: an analysis of clinical reasoning. Cambridge, MA: Harvard University Press, 1978.

    4. Hassebrock F, Johnson PE, Bullemer P, et al. When less is more: representation and selective memory in expert problem-solving. Am J Psychol 1993;106:155-189.

    5. Hatala RM, Norman GR, Brooks LR. Influence of a single example upon subsequent electrocardiogram interpretation. Teach Learn Med 1999;11:110-117.

    6. Ark TK, Brooks LR, Eva KW. Giving learners the best of both worlds: do clinical teachers need to guard against teaching pattern recognition to novices? Acad Med 2006;81:405-409.