Ensuring medical students are "fit for purpose"
     It is time for the UK to consider a national licensing process

    "The intent to develop liberally educated graduates, rather than competent technicians, is what makes a university a university."1

    This statement stands to be challenged. Postgraduate medical training in the United Kingdom is undergoing profound change as the Modernising Medical Careers project introduces a generic competency based curriculum for all newly graduated (foundation) doctors. A new culture of assessment is developing which increasingly focuses on testing clinical skills in the workplace.2 Licensing processes will be quality assured by the Postgraduate Medical Education Training Board (PMETB). The principles formulated for this encourage reliable, well designed assessments mapped against the requirements of the General Medical Council's (GMC) Good Medical Practice guidance (www.gmc-uk.org/med_ed/default.htm), appropriate standard setting, lay involvement, and transparency of process for candidates (www.pmetb.org.uk). Royal colleges are reviewing their accreditation processes to meet these requirements.

    Against this background it is time to reconsider undergraduate examinations for UK medical students. In contrast to the United States and Canada, where national licensing examinations are held, the GMC has fostered the individualism and liberal education of the universities. UK medical schools are free to develop their own courses and assessments in line with the GMC's recommendations. There is no common curriculum. Contrasting educational approaches have developed as new medical schools adopt more community based, behavioural science orientated, and integrated courses3 while others turn to new instruction methods, such as problem based learning. Significant variation in examinations has been highlighted.4 It is generally accepted that multiple methods are needed to assess all aspects of clinical competency. Universities inevitably approach this differently to mirror the philosophy of their curriculums but not necessarily with the psychometric support needed to ensure robust reliability.5

    We cannot assume that these different approaches produce graduates of equivalent technical competency. Standards for pass/fail are beginning to be set within medical schools but there are no systems to standardise these across schools. Members of the public and NHS employers must be assured that doctors entering the foundation programme are clinically competent. Overseas doctors have to pass the Professional and Linguistic Assessment Board (PLAB) examinations to prove they are fit to practise. Yet the guarantee that all UK graduates are clinically competent rests on asking colleagues to act as external examiners, a system which is open to criticism. It is difficult to challenge one's peers.

    Evidence is indeed emerging that standards might differ between medical schools (Boursicot KAM et al, Association for the Study of Medical Education Annual Scientific Meeting 2005; www.asme.org.uk/conf_courses/2004/docs_pix/asm04). Comparison of performance by medical school in the examination for membership of the Royal College of General Practitioners (MRCGP) shows a marked difference in candidates' achievements when analysed by medical school of qualification.6 Evidence also suggests that student training requirements are not uniformly addressed: in a recent survey only a third of preregistration house officers agreed that they had been well prepared for their work. There was a remarkable variation in response across medical schools.7

    A national licensing process must now be considered. There is well documented evidence from North America that licensing examinations can predict and set standards for competency.8 9 Both the Medical Council of Canada and the United States Medical Licensing Examination have improved the validity of their tests, incorporating clinical components designed to evaluate patient centred approaches to care and good communication. Development resources can be shared and psychometric support made available on a scale difficult for individual universities but comparable with that seen in the GMC's PLAB examination. In the face of rising student numbers and work pressures on staff, this is increasingly an issue for medical schools. To assure the public that their students are fit for purpose, universities must either accept that adequate resources (people and money) must be identified to fund assessment or look to the GMC to take responsibility for a national licensing process.

    Applying the principles of the Postgraduate Medical Education Training Board to current UK undergraduate assessments would inevitably instigate other changes. UK universities have been slow to include lay representation on examination and "fitness to progress" committees. The patient's voice must be heard if transparent, credible standards are to be set for exit from medical school into clinical practice. Similarly, evidence is emerging from the United States that poorly performing doctors are more likely to have exhibited attitudinal problems in medical school.10 More robust systems for assessing these attitudinal behaviours would enhance the authenticity of these procedures. A national licensing process for clinical competency need not be at the expense of a liberal education. It creates the potential to free up time, allow more opportunity to assess and remediate inappropriate attitudes in the workplace, and foster personal professional development.

    We need open debate on the feasibility of setting a national competency based curriculum for medical schools to achieve the correct balance between scientific knowledge and patient centred care. It is time to consider pooling expertise and resources, agreeing a national undergraduate assessment package, and implementing this at a standard that students, patients, and educators can all be proud of. If the requirements of Good Medical Practice are to be upheld, patients need the assurance that medical schools are producing doctors with the correct competencies at an appropriate standard.

    Val Wass, professor of community based medical education

    University of Manchester, Rusholme Health Centre, Manchester M14 5NP

    (valerie.wass@manchester.ac.uk)

    Competing interests: None.

    References

    Curry L, Wergin JF. Educating professionals. San Francisco: Jossey-Bass, 1993: 126.

    Van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ 2005;39: 309-17.

    Howe A, Campion P, Searle J, Smith H. New perspectives—approaches to medical education at four new UK medical schools. BMJ 2004;329: 327-31.

    Fowell SL, Maudsley G, Maguire P, Leinster SJ, Bligh J. Student assessment in undergraduate medical education in the United Kingdom 1998. Med Educ 2000;34(suppl): S1-49.

    Wass V, McGibbon D, Van der Vleuten CPM. Composite undergraduate clinical examinations: how should the components be combined to maximise reliability? Med Educ 2001;35: 326-30.

    Wakeford R, Foulkes J, McManus C, Southgate L. MRCGP pass rate by medical school and region of postgraduate training. BMJ 1993;307: 542-3.

    Goldacre MJ, Lambert T, Evans J, Turner G. Preregistration house officers' views on whether their experience at medical school prepared them well for their jobs: national questionnaire survey. BMJ 2003;326: 1011-2.

    Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ 2002;36: 853-9.

    Ramsey PG, Carline JD, Inui TS, Larson EB, LoGerfo JP, Wenrich MD. Predictive validity of certification by the American Board of Internal Medicine. Ann Intern Med 1989;110: 719-26.

    Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behaviour in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med 2004;79: 244-9.