Evaluating clinical skills of undergraduate pharmacy students using objective structured clinical examinations (OSCEs)

M Corbo, J P Patel, Abdel Tawab, J G Davies


Introduction: The objective structured clinical examination (OSCE) has been used for the competency assessment of clinical skills within the 4th year MPharm programme at the University of Brighton since 1999.

Aim: To evaluate the clinical performance of 4th year MPharm students across two academic years.

Methods: Final year pharmacy undergraduate students were divided into 16 groups and completed an OSCE following a one-week hospital placement. Each OSCE comprised six workstations.

Results: Significant differences were found between the students’ performances at the individual OSCE stations (Chi-square = 40.7; df = 5; p < 0.01). Students performed best on patient counselling stations and worst on calculation and problem identification and resolution type stations.
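For readers unfamiliar with the test reported above, the following sketch shows how a chi-square goodness-of-fit statistic across six OSCE stations could be computed. The per-station pass counts here are invented for illustration only; the study does not publish its raw per-station data, and the resulting statistic will not match the paper's value of 40.7.

```python
# Hedged illustration: chi-square goodness-of-fit across six OSCE
# stations, testing whether pass counts differ from a uniform spread.
# All counts below are hypothetical, not taken from the study.

def chi_square_goodness_of_fit(observed):
    """Return (statistic, degrees of freedom) against a uniform
    expected distribution over the observed categories."""
    n = sum(observed)
    k = len(observed)
    expected = n / k
    stat = sum((o - expected) ** 2 / expected for o in observed)
    df = k - 1
    return stat, df

# Six stations: counselling-type stations score high, calculation and
# problem-resolution stations low (invented counts for demonstration).
observed_passes = [88, 85, 80, 72, 60, 55]
stat, df = chi_square_goodness_of_fit(observed_passes)
print(f"chi-square = {stat:.1f}, df = {df}")
```

With six stations there are five degrees of freedom, matching the df = 5 reported in the Results.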

Conclusion: This study demonstrates that final year pharmacy undergraduates perform poorly in activities which demand an element of clinical problem identification and resolution, or when performing a clinical calculation. These results suggest that a lack of clinical exposure may be partly responsible for the students’ perceived inability to deal with “real life” situations involving clinical problem solving.


Keywords: Objective structured clinical examination (OSCE), undergraduate students, clinical skills assessment, multiple choice question





