
Validation and evaluation of new assessments for the otolaryngology undergraduate medical clerkship

Published online by Cambridge University Press: 28 September 2015

R Woods*, Department of Otolaryngology, Trinity College Dublin, Ireland
T Subramaniam, Department of Otolaryngology, Trinity College Dublin, Ireland
A Patterson, School of Medicine, Trinity College Dublin, Ireland
M Hennessy, School of Medicine, Trinity College Dublin, Ireland
C Timon, Department of Otolaryngology, Trinity College Dublin, Ireland

*Address for correspondence: Mr Robbie Woods, Trinity Department of Otolaryngology, Royal Victoria Eye and Ear Hospital, Adelaide Road, Dublin 2, Ireland. E-mail: woodsr@tcd.ie

Abstract

Objectives:

To validate and evaluate a short answer question paper and objective structured clinical examination. Validity and effect on overall performance were considered.

Methods:

Students completed a voluntary short answer question paper during their otolaryngology attachment. Short answer question paper results were collated and compared with results from the essay examination and the new end of year objective structured clinical examination.

Results:

The study comprised 160 students. Questions were validated for internal consistency (Cronbach's alpha = 0.76). Correlations were determined for: short answer question paper and essay results (r = 0.477), short answer question paper and objective structured clinical examination results (r = 0.355), and objective structured clinical examination and essay results (r = 0.292). On unpaired t-tests comparing the short answer question paper group and non-short answer question paper group, essay results were 1.2 marks higher (p = 0.45) and the objective structured clinical examination results were 0.09 marks lower (p = 0.74) in the short answer question paper group.

Conclusion:

Two new valid summative assessments of student ability have been introduced, which contribute to an enhanced programme of assessment to drive student learning.

Type: Main Articles
Copyright: © JLO (1984) Limited 2015

Introduction

The assessment of medical students is essential to ensure competence and standards in medical education.1 Assessment also has a crucial role in driving student learning via its content, format, timing and subsequent feedback.2 In order to fulfil these roles, it is important to develop reliable measurements of student performance.3

Miller proposed a hierarchical framework of clinical assessment, ranging from evaluation of cognition to behaviour in practice.4 In the past 30 years, traditional paradigms of assessment have changed, with a variety of new methods of assessment emerging that test these different domains of competency. Best practice in medical education involves using a variety of assessment methods at all levels of the hierarchical model.5 It has also been shown that continuous assessment can improve a student's learning experience, and it allows progress through a course to be monitored so that timely remediation may be taken where necessary.6,7

It has been established that approximately 32–35 per cent of students train to become primary care providers,8,9 with targets of up to 50 per cent in some countries.10 There have been longstanding concerns regarding a low priority assigned to ENT in the undergraduate curriculum.11 Conditions relating to ENT comprise 20–50 per cent of presenting complaints to a primary care provider; however, there is limited otolaryngology training in undergraduate medical education.12–15 Undergraduate ENT teaching has been combined with other specialties in more than half of medical schools in the UK, with approximately two weeks or less spent on an ENT placement, and often no specific assessment of ENT.12,16,17 It is therefore not surprising that 75 per cent of general practitioners feel that they would benefit from further training in ENT.12,18

Students at our institution attend a compulsory ENT attachment lasting two weeks. Traditionally, students have been assessed solely via an annual written examination based on topics covered in the curriculum. In this study, we evaluated the introduction of a short answer question paper and objective structured clinical examination to the ENT undergraduate assessment.

Materials and methods

The ENT attachment at our institution aims to give students a comprehensive grounding in the basics of ENT. During their placement, students are given lectures by senior clinicians, and didactic and practical tutorials by a dedicated ENT tutor. Students also spend time in out-patient clinics, an operating theatre, a dedicated ENT emergency department and ENT wards. In addition, there are tutorials provided by audiological and speech and language specialists. An attendance sheet is given to students, which must be signed by a clinician after each session. The ENT examination results were previously included as a small percentage of the overall final surgical grade; however, ENT is now assessed as a separate module.

We carried out a prospective, comparative, quasi-experimental study to evaluate the introduction of two new assessments as part of the course. The short answer question paper was introduced as an aspect of continuous assessment at the end of the two-week attachment. It lasted 30 minutes and consisted of 10 picture-based questions; an example is shown in Figure 1. Each paper was drawn at random from a bank of 30 questions, blueprinted from the curriculum and approved by the departmental professor to ensure content validity. The paper was shown to be reliable by testing for internal consistency (average Cronbach's alpha = 0.76). The paper was voluntary, and students who wished to take it and participate in the study provided written informed consent. Papers were scored by a single examiner blinded to student identifiers. Feedback on results was given to students.
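
For illustration, the internal consistency statistic can be reproduced in Python. The sketch below assumes a students-by-items matrix of item scores; the matrix here is simulated, as the item-level data are not reproduced in this paper.

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for a (students x items) matrix of item scores."""
        k = item_scores.shape[1]                         # number of items
        item_vars = item_scores.var(axis=0, ddof=1)      # per-item variance
        total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Illustrative only: 83 students by 10 items, each item scored out of 10
    rng = np.random.default_rng(42)
    scores = rng.integers(0, 11, size=(83, 10)).astype(float)
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")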

Fig. 1 Example of question from short answer question paper.

The results of the short answer question paper were collated and plotted against the end of year objective structured clinical examination and written essay paper results. Unpaired t-tests were performed to compare results from students who took the short answer question paper with results from those who did not. A p value of less than 0.05 was considered significant.
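
As a sketch of this group comparison, an unpaired t-test can be run with scipy; the mark vectors below are simulated stand-ins for the study data, sized to the two cohorts (83 students who took the short answer question paper and 77 who did not).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    essay_saq_group = rng.normal(62, 10, 83)     # simulated essay marks, SAQ takers
    essay_no_saq_group = rng.normal(61, 10, 77)  # simulated essay marks, non-takers

    t_stat, p_value = stats.ttest_ind(essay_saq_group, essay_no_saq_group, equal_var=True)
    print(f"mean difference = {essay_saq_group.mean() - essay_no_saq_group.mean():.2f} marks, "
          f"p = {p_value:.2f}")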

An ENT objective structured clinical examination station was introduced to test practical skills at the end of the student year. Otoscopy and tuning fork tests were examined on a model head with a slide of a perforated tympanic membrane inserted into the auditory canal. The examination lasted 6 minutes. The scoring sheet is shown in Figure 2. Four examiners took part in scoring the students. Results were collated and plotted against the end of year written examination and the short answer question paper results.

Fig. 2 Objective structured clinical examination marking sheet.

Students sat an end of year written essay examination, blueprinted from curricular topics, which comprised the majority of the overall score for the module. This was scored by two separate examiners who were blinded to student identifiers. It consisted of five essay questions based on clinical scenarios and was designed to assess clinical reasoning. The examination lasted one hour.

Repeat students were excluded from the analysis of results in this study. The study was undertaken following approval by the university research ethics committee.

Results

There were 166 students in the year group. Of these, six were excluded from the analysis as they were repeat students from the previous year. The attendance rate during the clinical attachment was 81 per cent.

The voluntary short answer question paper was completed by 83 students. The average score was 76 per cent, with a range of 43–97 per cent. A total of 160 students took part in the objective structured clinical examination. The average score was 78 per cent, with a range of 5–100 per cent. A total of 160 students completed the essay paper. The average score was 61 per cent, with a range of 30–88 per cent.

Short answer question paper results were plotted against essay paper results to test for predictive validity. This demonstrated a medium strength positive correlation, with a Pearson correlation coefficient of r = 0.477 (Figure 3). Objective structured clinical examination results were plotted against short answer question paper results to test for predictive validity. This again demonstrated a medium strength positive correlation, with a Pearson correlation coefficient of r = 0.355 (Figure 4). Objective structured clinical examination results were plotted against essay paper results to test for concurrent validity. This demonstrated a weak positive correlation, with a Pearson correlation coefficient of r = 0.292 (Figure 5).
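
The correlation coefficients reported above can be computed with scipy's Pearson routine. This is a sketch only: the paired scores below are simulated with means resembling those reported, not the study data shown in Figures 3 to 5.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    saq = rng.normal(76, 12, 83)               # simulated SAQ percentages
    essay = 0.5 * saq + rng.normal(23, 9, 83)  # simulated, partially correlated essay percentages

    r, p = stats.pearsonr(saq, essay)
    print(f"Pearson r = {r:.3f}, p = {p:.3g}")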

Fig. 3 Scatter plot of short answer question (SAQ) paper results against essay paper results.

Fig. 4 Scatter plot of objective structured clinical examination (OSCE) results against short answer question (SAQ) paper results.

Fig. 5 Scatter plot of objective structured clinical examination (OSCE) results against essay paper results.

Unpaired t-tests were performed to compare the group of students who took the short answer question paper against the group of students who did not take the short answer question paper. There was a mean improvement of 1.2 marks in the essay paper for the group of students who took the short answer question paper; however, this was not statistically significant (p = 0.45). The score in the objective structured clinical examination was 0.09 marks lower in the group of students who took the short answer question paper; however, this was not statistically significant (p = 0.74).

Discussion

The principle of triangulation is now recognised as important in the assessment of professional competence, such that contemporary best practice is moving towards designing programmes of assessment rather than individual assessments.19 No single method is likely to assess all domains of competency, so a variety of assessment methods is required, such that the shortcomings of one can be overcome by the advantages of another.20 Utilising a variety of judges, instruments and contexts also ensures both validity and reliability,21 an ever increasing concern for medical schools given the growing pressure of litigation and the need to be able to defend assessment decisions.22

This study analysed the introduction of new assessments to our ENT course. Examination content was deemed satisfactory by reference to the curriculum and approval from the head of department, and internal consistency of the short answer question paper was determined. This study demonstrates that both the short answer question paper and objective structured clinical examination are valid tests of student knowledge and ability in the topics covered, as the results were positively correlated when referenced to the essay paper and to each other. This indicates predictive and concurrent validity. It is interesting to note a lower positive correlation between the objective structured clinical examination and essay examination. This finding concurs with results elsewhere,23 and is likely to be a result of the disparity in the skills being assessed.24 However, an objective structured clinical examination is accepted as a reliable and valid measure of objective assessment.25,26

These examinations have introduced the elements of both continuous assessment and assessment of ENT practical skills to enhance our undergraduate ENT course. 'Open form' or 'context-rich' assessments are most suitable for evaluating the application of knowledge and higher level abilities.27,28 Multiple levels of Miller's hierarchical framework of assessment4 are now included in our course. Knowledge and factual recognition are evaluated using a short answer question paper. Integrated knowledge and the capacity to apply it in a clinical context are assessed via an essay paper of clinical scenarios. Finally, performance and practical skills are evaluated using an objective structured clinical examination. While it is established that essay and short answer question papers are resource intensive in comparison with machine-markable assessments,27 the advantage of a dedicated ENT tutor at our institution facilitates the use of these written assessment methods. To save on resources, other institutions may prefer to develop tests of knowledge based on multiple-choice questions.

Although we have shown the short answer question paper to be a valid test of student ability, the results indicate that sitting it does not significantly alter performance in the essay paper or objective structured clinical examination. As an intervention in itself, the short answer question paper may not improve student ability in completing an essay paper or objective structured clinical examination in a formative sense, perhaps because these assess different domains of competency. However, it can still act as a valid and useful summative assessment, by testing knowledge in a different capacity and by introducing an element of continuous assessment to the course, and can thus contribute to an overall programme of assessment. Indeed, subjective feedback from a focus group of randomly invited students at the end of the year suggested that those who completed the short answer question paper felt more engaged in the course, a finding noted elsewhere, despite no quantifiable effect on student learning being demonstrated.29 Face validity was achieved in that students also considered the short answer question paper a fair test of their abilities and felt it could reasonably be incorporated into the overall score for the module.

  • There is widespread concern regarding the inadequacy of ENT teaching at undergraduate level

  • Improvement in assessment of ENT can drive improved learning

  • New, additional methods of assessment covering a range of competencies were shown to be valid and reliable

  • Participation in a short answer question paper did not improve results in other forms of examination, but was valid for continuous assessment

  • A thorough medical course should include lectures, clinical placement, continuous assessment and end of year examination

There was potential for bias in this study in a number of ways. As the short answer question paper was voluntary, it was possible that more dedicated or diligent students would have chosen to complete the paper, and thus the results could have been skewed in favour of the short answer question paper group. However, this was not evident, and in fact the average score in the objective structured clinical examination was lower in the short answer question paper group, although this finding was not significant. Alternatively, students who were weak in the area or unsure of their abilities may have opted to take the paper in order to test their knowledge. Some students may have been advantaged by having their two-week placement close to the end of year examinations and thus may have been more incentivised to engage in the learning process. They may also have heard questions from previous groups. Despite established marking schemes, there was potential for examiner bias in the short answer question paper and objective structured clinical examination, but the essay papers were reviewed by two examiners to ensure reliability.

Construct validity of the short answer question paper was not measured by pre- and post-intervention scoring; however, we have shown the test to be valid by criterion reference to two other examinations and by content in relation to the curriculum, and thus consider it valid for inclusion in the summative assessment.

Based on recent statistics on ENT teaching at undergraduate level, there is clearly a demand and need for more undergraduate ENT teaching.30 This is important in order to catch up with other specialties,31 particularly given the cross-relevance of ENT to many postgraduate disciplines. Furthermore, improved undergraduate ENT courses are likely to encourage higher calibre students into the specialty.32 As assessment drives learning,33 an important aspect of improving ENT at undergraduate level involves incorporating a variety of valid and appropriate assessment methods.

Conclusion

We have introduced two valid and reliable assessments to our ENT course. The short answer question paper did not improve results in other assessments; however, it introduces continuous assessment and acts as an assessment of basic knowledge, which provides important information to students regarding their performance. The objective structured clinical examination allows a practical assessment of clinical skills. These assessments have enhanced the undergraduate ENT course at our institution by driving learning through a programme of validated assessment methods across different domains of competency.

Acknowledgements

We would like to thank Frances Colgan and Michele Keane from the Dublin University School of Medicine for their assistance with this study.

Footnotes

Presented at the Irish Otolaryngology Society annual meeting, 11 October 2014, Donegal, Ireland.

References

1. Epstein RM. Assessment in medical education. N Engl J Med 2007;356:387–96
2. Van Der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996;1:41–67
3. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet 2001;357:945–9
4. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 suppl):S63–7
5. Schuwirth LW, van der Vleuten CP. ABC of learning and teaching in medicine: written assessment. BMJ 2003;326:643–5
6. Vergis A, Hardy K. Principles of assessment: a primer for medical educators in the clinical years. Internet Journal of Medical Education 2009;1(1)
7. Hayes A, Holden C, Gaynor D, Kavanagh B, Otoom S. Bridging the gap: a program to enhance medical students' learning experience in the foundation year. Bahrain Medical Bulletin 2013;35:201–5
8. Voorhees KI, Prado-Gutierrez A, Epperly T, Dirkson D. A proposal for reform of the structure and financing of primary care graduate medical education. Fam Med 2013;45:164–70
9. Lambert T, Goldacre R, Smith F, Goldacre MJ. Reasons why doctors choose or reject careers in general practice: national surveys. Br J Gen Pract 2012;62:e851–8
10. Department of Health. A High Quality Workforce: NHS Next Stage Review. London: HMSO, 2008
11. Vinayak BC, Bates GJ. Undergraduate training in ENT - time for change? J R Soc Med 1993;86:181
12. Clamp PJ, Gunasekaran S, Pothier DD, Saunders MW. ENT in general practice: training, experience and referral rates. J Laryngol Otol 2007;121:580–3
13. Hu A, Sardesai MG, Meyer TK. A need for otolaryngology education among primary care providers. Med Educ Online 2012;17:17350
14. Donnelly MJ, Walsh MA, Hone S, O'Sullivan P. Examining the ear: clinical teaching. Med Educ 1996;30:299–302
15. Donnelly MJ, Quraishi MS, McShane DP. ENT and general practice: a study of paediatric ENT problems seen in general practice and recommendations for general practitioner training in ENT in Ireland. Ir J Med Sci 1995;164:209–11
16. Mace AD, Narula AA. Survey of current undergraduate otolaryngology training in the United Kingdom. J Laryngol Otol 2004;118:217–20
17. Khan MM, Saeed SR. Provision of undergraduate otorhinolaryngology teaching within General Medical Council approved UK medical schools: what is current practice? J Laryngol Otol 2012;126:340–4
18. Sharma A, Machen K, Clarke B, Howard D. Is undergraduate otorhinolaryngology teaching relevant to junior doctors working in accident and emergency departments? J Laryngol Otol 2006;120:949–51
19. Wass V, Bowden R, Jackson N. The principles of assessment design. In: Jackson N, Jameson A, Khan A, eds. Assessment in Medical Education and Training: A Practical Guide. Oxford: Radcliffe, 2007;11–26
20. Al-Wardy NM. Assessment methods in undergraduate medical education. Sultan Qaboos Univ Med J 2010;10:203–9
21. van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ 2005;39:309–17
22. Hawthorne K. Assessment in the undergraduate curriculum. In: Jackson N, Jameson A, Khan A, eds. Assessment in Medical Education and Training: A Practical Guide. Oxford: Radcliffe, 2007;27–40
23. Raj N, Badcock LJ, Brown GA, Deighton CM, O'Reilly SC. Design and validation of 2 objective structured clinical examination stations to assess core undergraduate examination skills of the hand and knee. J Rheumatol 2007;34:421–4
24. Chandratilake M, Davis M, Ponnamperuma G. Evaluating and designing assessments for medical education: the utility formula. Internet Journal of Medical Education 2009;1(1)
25. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ 2004;38:199–203
26. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:41–54
27. Munro N, Denney ML, Rughani A, Foulkes J, Wilson A, Tate P. Ensuring reliability in UK written tests of general practice: the MRCGP examination 1998–2003. Med Teach 2005;27:37–45
28. Schuwirth LW, van der Vleuten CP. Different written assessment methods: what can be said about their strengths and weaknesses? Med Educ 2004;38:974–9
29. Evans DJ, Zeun P, Stanier RA. Motivating student learning using a formative assessment journey. J Anat 2014;224:296–303
30. Lloyd S, Tan ZE, Taube MA, Doshi J. Development of an ENT undergraduate curriculum using a Delphi survey. Clin Otolaryngol 2014;39:281–8
31. Powell J, Cooles FA, Carrie S, Paleri V. Is undergraduate medical education working for ENT surgery? A survey of UK medical school graduates. J Laryngol Otol 2011;125:896–905
32. Ranta M, Hussain SS, Gardiner Q. Factors that inform the career choice of medical students: implications for otolaryngology. J Laryngol Otol 2002;116:839–41
33. Fowell SL, Bligh JG. Recent developments in assessing medical students. Postgrad Med J 1998;74:18–24