A conceptual framework may be of limited value
     1 Department of clinical epidemiology and biostatistics, McMaster University, Hamilton, Canada norman@mcmaster.ca

    Straus et al provide a conceptual framework for evaluation of strategies for teaching evidence based medicine (EBM).1 They correctly state that there is little evidence of effectiveness of teaching EBM, a deficiency frequently identified by critics.2 3 The authors' assumption is that provision of a conceptual framework will lead to better studies. But will it? Is it really the case that a conceptual framework leads naturally to well designed studies? If so, this represents a reorientation of the first author, who previously stated that "no investigative team has yet overcome the problems of sample size, contamination and blinding that such a trial raises,"3 which puts the problem squarely in the court of methodology. And I think she is at least partially right.4 It is well nigh impossible to conceive of an effective educational intervention where the teachers were standardised and participants were blinded, hence unaware that they had received the intervention.

    This does not preclude the possibility that a conceptual framework may help. But it seems to me that its major consequence may be to impress upon potential researchers the daunting nature of the task facing them so forcefully that it stimulates abandonment of research rather than its initiation. A quick calculation from Straus et al's table 1 shows that, if you were serious about doing a study aimed at "doers," you would have to obtain reliable and valid information from each of 5 × 5 = 25 cells. And that is a substantial problem. Although the authors claim that the article's focus is on "psychometrically strong measurements," that is the last time psychometric issues are raised. Others have suggested that EBM studies suffer from a "lack of validated outcome measures," particularly those that focus on learner behaviours,5 yet there is no mention of that here.

    In the end, I suspect that if a trial, using good design and psychometrically defensible instruments, showed that those who had a course in EBM actually delivered better care than those who did not, all the assessment of attitudes, knowledge, and skills elucidated in this article would probably be viewed as irrelevant. I await the day.

    Funding: None.

    Competing interests: None declared.

    References

    1. Straus SE, Green ML, Bell DS, Badgett R, Davis D, Gerrity M, et al, for the Society of General Internal Medicine Evidence-Based Medicine Task Force. Evaluating the teaching of evidence based medicine: conceptual framework. BMJ 2004;329:1029-32.

    2. Norman G, Shannon SI. Effectiveness of instruction in critical appraisal skills: a critical appraisal. CMAJ 1998;158:177-81.

    3. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ 2000;163:837-41.

    4. Norman GR. RCT = results confounded and trivial: the perils of grand educational experiments. Med Educ 2003;37:582-4.

    5. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA 2002;288:1110-3.

    Geoff Norman, professor1