Evaluating the teaching of evidence based medicine: conceptual framework

BMJ (British Medical Journal), 2004, issue 18
     1 Department of medicine, Toronto General Hospital, 200 Elizabeth Street, 9ES-407, Toronto, Ontario M5G 2C4, Canada, 2 Department of internal medicine, Yale University School of Medicine, New Haven, CT, USA, 3 Department of medicine, David Geffen School of Medicine, UCLA, Los Angeles, CA, USA, 4 Department of medicine, University of Texas Health Science Centre at San Antonio, TX, USA, 5 Department of health policy, management and evaluation, University of Toronto, Toronto, Canada, 6 Department of medicine, Oregon Health Sciences University, Portland OR, USA, 7 Washington DC VA Medical Centre, Washington, DC, USA, 8 Department of medicine, VA Medical Affairs, Birmingham, AL, USA, 9 Department of medicine, University of Chicago, Chicago, IL, USA, 10 Department of medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA

    Correspondence to: S E Straus sharon.straus@utoronto.ca

    Although evidence for the effectiveness of evidence based medicine has accumulated, there is still little evidence on the most effective methods of teaching it.

    Introduction

    Learners can be doctors, patients, policy makers, or managers. This article focuses on doctors, but our evaluation framework could be applied to other audiences.


    Not all doctors want or need to learn how to practise all five steps of EBM (asking, acquiring, appraising, applying, assessing).4 5 Indeed, most doctors consider themselves users of EBM, and surveys of clinicians show that only about 5% believe that learning all five steps is the most appropriate way of moving from opinion based to evidence based medicine.4

    Doctors can incorporate evidence into their practice in three ways.3 6 In a clinical situation, the extent to which each step of EBM is performed depends on the nature of the encountered condition, time constraints, and level of expertise with each of the steps. For frequently encountered conditions (such as unstable angina) and with minimal time constraints, we operate in the "doing" mode, in which at least the first four steps are completed. For less common conditions (such as aspirin overdose) or for more rushed clinical situations, we eliminate the critical appraisal step and operate in the "using" mode, conserving our time by restricting our search to rigorously preappraised resources (such as Clinical Evidence). Finally, in the "replicating" mode we trust and directly follow the recommendations of respected EBM leaders (abandoning at least the search for evidence and its detailed appraisal). Doctors may practise in any of these modes at various times, but their activity will probably fall predominantly into one category.

    The various methods of teaching EBM must therefore address the needs of these different learners. One size cannot fit all. Similarly, if a formal evaluation of the educational activity is required, the evaluation method should reflect the different learners' goals. Although several questionnaires have been shown to be useful in assessing the knowledge and skills needed for EBM,7 8 we must remember that the knowledge and skills targeted by these tools may not match those that our own learners need. The careful identification of our learners (their needs and learning styles) forms the first dimension of the evaluation framework that we are proposing.

    What is the intervention?

    Effective teaching of EBM will produce a wide range of outcomes. Various levels of educational outcomes could be considered, including attitudes, knowledge, skills, behaviours, and clinical outcomes. The outcome level (the third dimension of the conceptual framework) reflects Miller's pyramid for evaluating clinical competence12 and builds on the competency grid for evidence based health care proposed by Greenhalgh.13 Changes in doctors' knowledge and skills are relatively easy to detect, and several instruments have been evaluated for this purpose.7 8 However, many of these instruments primarily evaluate critical appraisal skills, focusing on the role of "doer" rather than "user." A Cochrane review of critical appraisal teaching found only one study that met the authors' inclusion criteria; that study showed that the course increased knowledge of critical appraisal.10 Within our proposed framework, evaluation of this teaching course falls into the learner domain of "doing," the intervention domain of "appraisal," and the outcome domain of "knowledge."

    Changes in behaviours and clinical outcomes are more difficult to measure because they require assessment in the practice setting. For example, in a study evaluating a family medicine training programme, doctor-patient interactions were videotaped and analysed for EBM content.14 A recent before and after study has shown that a multi-component intervention including teaching EBM skills and providing electronic resources to consultants and house officers significantly improved their evidence based practice (Straus SE et al, unpublished data). With our proposed framework, evaluation of this latter teaching intervention would be categorised into the learner domain of "doing." The intervention domains include all five steps of EBM, and the outcome domain would be "doctor behaviour."

    Implementing the evaluation framework

    Our model requires that teachers work with learners to understand their goals, to identify in what mode of practice they want to enhance their expertise, and to determine their preferred learning style. This simple model could be expanded to include other dimensions, including the role of the teacher and the "dose" and "formulation" of what is taught. However, our primary goal was to develop a matrix that was easy to use. Although we have applied this framework to several of the published evaluation instruments and have found it to be useful, others may find that it does not meet all of their requirements.

    What's next?

    References

    1. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA 2002;288:1110-2.

    2. Society of General Internal Medicine. Evidence based medicine. www.sgim.org/ebm.cfm (accessed 1 Oct 2004).

    3. Sackett DL, Straus SE, Richardson WS, Rosenberg WMC, Haynes RB. Evidence-based medicine: how to practice and teach EBM. London: Churchill Livingstone, 2000.

    4. McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence-based medicine: a questionnaire survey. BMJ 1998;316:361-5.

    5. McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practicing clinician: a survey of Canadian general internists. J Gen Intern Med 1999;14:236-42.

    6. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ 2000;163:837-41.

    7. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ 2002;325:1338-41.

    8. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ 2003;326:319-21.

    9. Rosenberg WM, Deeks J, Lusher A, Snowball R, Dooley G, Sackett D. Improving searching skills and evidence retrieval. J R Coll Physicians Lond 1998;32:557-63.

    10. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database Syst Rev 2001;(3):CD001270.

    11. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal and evidence-based medicine: a critical review of curricula. Acad Med 1999;74:686-94.

    12. Miller GE. The assessment of clinical skills/competency/performance. Acad Med 1990;65(9 suppl):S63-7.

    13. Greenhalgh T, Macfarlane F. Towards a competency grid for evidence-based practice. J Eval Clin Pract 1997;3:161-5.

    14. Ross R, Verdieck A. Introducing an evidence-based medicine curriculum into a family practice residency: is it effective? Acad Med 2003;78:412-7.

    15. Kunz R, Fritsche L, Neumayer HH. Development of quality assurance criteria for continuing education in evidence-based medicine. Z Arztl Fortbild Qualitatssich 2001;95:371-5.