
UCL News


UK medical school variations in graduates' clinical performance

15 February 2008

Links:

  • Professor McManus
  • Professor Dacre
  • Read full paper

A new study led by Professor Chris McManus (UCL Psychology) has found that UK medical graduates show substantial differences in clinical performance depending on which school they attended.

The findings, published in BMC Medicine, support the argument that British doctors should be licensed by taking a national examination.

The team assessed the performance of graduates from 19 UK universities who had taken Parts 1 and 2 of the Membership of the Royal Colleges of Physicians (MRCP(UK)) examination, which are multiple-choice assessments, and PACES, an assessment of clinical examination and communication skills that uses real and simulated patients. The researchers also explored the reasons for the differences between medical schools.

Professor McManus explained: "The General Medical Council (GMC) has explored the possibility of a national medical licensing examination in the UK, as exists in the US. Our study provides a strong argument for introducing one, as we have shown that graduates from different medical schools perform markedly differently in terms of their knowledge, clinical and communication skills".

In Parts 1 and 2, the performance of Oxford, Cambridge and Newcastle-upon-Tyne graduates was significantly better than average, and the performance of Liverpool, Dundee, Belfast and Aberdeen graduates was significantly worse than average. In the PACES examination, Oxford graduates performed significantly above average, and Dundee and Liverpool graduates significantly below average. At the first attempt, 91 per cent of Oxford, 76 per cent of Cambridge and 67 per cent of Newcastle graduates passed Part 1, compared with 32 per cent of Liverpool and 38 per cent of Dundee graduates.

Medical school was not the only factor influencing performance among the 5,827 doctors included in the research: men outperformed women on the multiple-choice examinations, while women outperformed men on the clinically based PACES examination. The team also examined whether differences in pre-admission qualifications could explain the differences between medical schools, and found that they did so only in part, suggesting that differences in the teaching focus, content and approaches of the medical schools themselves also play a role.

Professor Jane Dacre (UCL Academic Centre for Medical Education) explained: "Although the MRCP(UK) is a highly regarded exam that is carefully designed to assess a wide range of knowledge and skills required by a physician, it is possible that some medical schools teach other important skills that this examination does not assess. Our data do show that there is a real need for routine collection and audit of performance data of UK medical graduates, both in postgraduate exams such as the MRCP(UK) and probably also by a national licensing exam."

To find out more, use the links at the top of this article.

Image: Professor McManus