Introduction
The assessment of medical students is essential to ensure competence and standards in medical education.[1] Assessment also has a crucial role in driving student learning via its content, format, timing and subsequent feedback.[2] In order to fulfil these roles, it is important to develop reliable measurements of student performance.[3]
Miller proposed a hierarchical framework of clinical assessment, ranging from evaluation of cognition to behaviour in practice.[4] In the past 30 years, traditional paradigms of assessment have changed, with a variety of new methods emerging that test these different domains of competency. Best practice in medical education involves using a variety of assessment methods at all levels of the hierarchical model.[5] It has also been shown that continuous assessment can improve students' learning experience, and it allows progress through a course to be monitored so that timely remediation may be undertaken where necessary.[6,7]
It has been established that approximately 32–35 per cent of students train to become primary care providers,[8,9] with targets of up to 50 per cent in some countries.[10] There have been longstanding concerns regarding the low priority assigned to ENT in the undergraduate curriculum.[11] Conditions relating to ENT comprise 20–50 per cent of presenting complaints to a primary care provider; however, there is limited otolaryngology training in undergraduate medical education.[12–15] Undergraduate ENT teaching has been combined with other specialties in more than half of medical schools in the UK, with approximately two weeks or less spent on an ENT placement, and often no specific assessment of ENT.[12,16,17] It is therefore not surprising that 75 per cent of general practitioners feel that they would benefit from further training in ENT.[12,18]
Students at our institution attend a compulsory ENT attachment lasting two weeks. Traditionally, students have been assessed solely via an annual written examination based on topics covered in the curriculum. In this study, we evaluated the introduction of a short answer question paper and objective structured clinical examination to the ENT undergraduate assessment.
Materials and methods
The ENT attachment at our institution aims to give students a comprehensive grounding in the basics of ENT. During their placement, students are given lectures by senior clinicians, and didactic and practical tutorials by a dedicated ENT tutor. Students also spend time in out-patient clinics, an operating theatre, a dedicated ENT emergency department and ENT wards. In addition, there are tutorials provided by audiological and speech and language specialists. An attendance sheet is given to students, which must be signed by a clinician after each session. The ENT examination results were previously included as a small percentage of the overall final surgical grade; however, it is now a separate module.
We carried out a prospective, comparative, quasi-experimental study to evaluate the introduction of two new assessments as part of the course. The short answer question paper was introduced as an element of continuous assessment at the end of the two-week attachment. It lasted 30 minutes and consisted of 10 questions based on a picture, an example of which is shown in Figure 1. Each paper was randomly chosen from a bank of 30 questions, blueprinted from the curriculum and approved by the departmental professor to ensure content validity. The paper was shown to be reliable by testing for internal consistency (mean Cronbach's alpha = 0.76). It was voluntary, and students who wished to take the paper and participate in the study provided written informed consent. Papers were scored by a single examiner blinded to student identifiers. Feedback on results was given to students.
Fig. 1 Example of question from short answer question paper.
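The internal-consistency statistic reported above can be reproduced in a few lines of code. The sketch below computes Cronbach's alpha from a students × items score matrix; the demonstration data are illustrative only, not the study's results.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 5 students x 4 items (hypothetical, not the study's data)
demo = np.array([
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [1, 2, 2, 1],
    [4, 3, 4, 4],
    [2, 2, 3, 2],
])
print(round(cronbach_alpha(demo), 2))
```

Values above roughly 0.7, as in the paper's 0.76, are conventionally taken to indicate acceptable internal consistency.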
The results of the short answer question paper were collated and plotted against the end of year objective structured clinical examination and written essay paper. Unpaired t-tests were performed to compare results from students who took the short answer question paper to results from students who did not take the short answer question paper. A p value of less than 0.05 was considered significant.
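For reference, the unpaired (Student's) t-test used above compares two group means using a pooled variance estimate. This stdlib-only sketch computes the t statistic and degrees of freedom; the scores are hypothetical, and in practice a statistics package would also supply the p value.

```python
import math
from statistics import mean, variance

def unpaired_t(group_a, group_b):
    """Student's two-sample (unpaired) t statistic with pooled variance."""
    na, nb = len(group_a), len(group_b)
    pooled = ((na - 1) * variance(group_a) + (nb - 1) * variance(group_b)) / (na + nb - 2)
    se = math.sqrt(pooled * (1 / na + 1 / nb))
    return (mean(group_a) - mean(group_b)) / se, na + nb - 2  # (t, degrees of freedom)

# Hypothetical essay scores (per cent) for SAQ takers vs non-takers
saq_group = [62, 58, 71, 65, 60]
no_saq    = [60, 57, 69, 64, 59]
t, df = unpaired_t(saq_group, no_saq)
print(f"t = {t:.2f} on {df} df")  # compare |t| to the critical value for p < 0.05
```

The test assumes roughly equal variances in the two groups; where that assumption is doubtful, Welch's variant is the usual alternative.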
An ENT objective structured clinical examination station was introduced to test practical skills at the end of the student year. Otoscopy and tuning fork tests were examined on a model head with a slide of a perforated tympanic membrane inserted into the auditory canal. The examination lasted 6 minutes. The scoring sheet is shown in Figure 2. Four examiners took part in scoring the students. Results were collated and plotted against the end of year written examination and the short answer question paper results.
Fig. 2 Objective structured clinical examination marking sheet.
Students sat an end of year written essay examination, blueprinted from curricular topics, which comprised the majority of the overall score for the module. This was scored by two separate examiners who were blinded to student identifiers. It consisted of five essay questions based on clinical scenarios and was designed to assess clinical reasoning. The examination lasted one hour.
Repeat students were excluded from the analysis of results in this study. The study was undertaken following approval by the university research ethics committee.
Results
There were 166 students in the year group. Of these, six were excluded from the analysis in this study as they were repeat students from the previous year. Attendance rates during the clinical attachment were 81 per cent.
The voluntary short answer question paper was completed by 83 students. The average score was 76 per cent, with a range of 43–97 per cent. A total of 160 students took part in the objective structured clinical examination. The average score was 78 per cent, with a range of 5–100 per cent. A total of 160 students completed the essay paper. The average score was 61 per cent, with a range of 30–88 per cent.
Short answer question paper results were plotted against essay paper results to test for predictive validity. This demonstrated a medium strength positive correlation, with a Pearson correlation coefficient of r = 0.477 (Figure 3). Objective structured clinical examination results were plotted against short answer question paper results to test for predictive validity. This again demonstrated a medium strength positive correlation, with a Pearson correlation coefficient of r = 0.355 (Figure 4). Objective structured clinical examination results were plotted against essay paper results to test for concurrent validity. This demonstrated a weak positive correlation, with a Pearson correlation coefficient of r = 0.292 (Figure 5).
Fig. 3 Scatter plot of short answer question (SAQ) paper results against essay paper results.
Fig. 4 Scatter plot of objective structured clinical examination (OSCE) results against short answer question (SAQ) paper results.
Fig. 5 Scatter plot of objective structured clinical examination (OSCE) results against essay paper results.
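The coefficients reported above are Pearson product-moment correlations between paired score lists, which can be computed directly. A minimal sketch, using made-up paired scores rather than the study's data:

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired SAQ and essay scores (per cent), one pair per student
saq   = [76, 43, 97, 81, 65, 70]
essay = [61, 40, 78, 66, 55, 63]
print(round(pearson_r(saq, essay), 3))
```

By the usual rule of thumb, r around 0.3 is weak, around 0.5 moderate, matching the paper's description of its 0.29–0.48 coefficients.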
Unpaired t-tests were performed to compare the group of students who took the short answer question paper against the group of students who did not take the short answer question paper. There was a mean improvement of 1.2 marks in the essay paper for the group of students who took the short answer question paper; however, this was not statistically significant (p = 0.45). The score in the objective structured clinical examination was 0.09 marks lower in the group of students who took the short answer question paper; however, this was not statistically significant (p = 0.74).
Discussion
The principle of triangulation is now recognised as important in the assessment of professional competence, such that contemporary best practice is moving towards designing programmes of assessment rather than individual assessments.[19] It is unlikely that one method will assess all domains of competency, and so a variety of assessment methods are required, such that the shortcomings of one can be overcome by the advantages of another.[20] Utilising a variety of judges, instruments and contexts also ensures both validity and reliability,[21] which is an ever-increasing concern for medical schools in view of the growing pressure of litigation and the need to defend assessment decisions.[22]
This study analysed the introduction of new assessments to our ENT course. Examination content was deemed satisfactory by reference to the curriculum and approval from the head of department, and internal consistency of the short answer question paper was determined. This study demonstrates that both the short answer question paper and objective structured clinical examination are valid tests of student knowledge and ability in the topics covered, as the results were positively correlated when referenced to the essay paper and to each other. This indicates predictive and concurrent validity. It is interesting to note the lower positive correlation between the objective structured clinical examination and essay examination. This finding concurs with results elsewhere,[23] and is likely to be a result of the disparity in the skills being assessed.[24] However, an objective structured clinical examination is accepted as a reliable and valid measure of objective assessment.[25,26]
These examinations have introduced the elements of both continuous assessment and assessment of ENT practical skills to enhance our undergraduate ENT course. ‘Open form’ or ‘context-rich’ assessments are most suitable for evaluating the application of knowledge and higher-level abilities.[27,28] Multiple levels of Miller's hierarchical framework of assessment[4] are now included in our course. Knowledge and factual recognition are evaluated using a short answer question paper. Integrated knowledge and the capacity to apply it in a clinical context are assessed via an essay paper of clinical scenarios. Finally, performance and practical skills are evaluated using an objective structured clinical examination. While it is established that essay and short answer question papers are resource intensive in comparison to machine-markable assessments,[27] the presence of a dedicated ENT tutor at our institution facilitates the use of these written assessment methods. To save on resources, other institutions may prefer to develop tests of knowledge based on multiple-choice questions.
Although we have shown the short answer question paper to be a valid test of student ability, the results indicate that sitting the paper does not significantly alter the results of the essay paper or objective structured clinical examination. As an intervention in itself, the short answer question paper may not improve student performance in an essay paper or objective structured clinical examination in a formative sense, perhaps because these assess different domains of competency. However, the short answer question paper can still act as a valid and useful summative assessment, by testing knowledge in a different capacity and by introducing an element of continuous assessment to the course, and can thus contribute to an overall programme of assessment. Indeed, subjective feedback from a focus group of randomly invited students at the end of the year suggested that those who completed the short answer question paper felt more engaged in the course, an effect noted elsewhere, despite no quantifiable effect on student learning being determined.[29] Face validity was achieved in that students also considered the short answer question paper a fair test of their abilities and felt it could reasonably be incorporated into the overall score for the module.
• There is widespread concern regarding the inadequacy of ENT teaching at undergraduate level
• Improvement in assessment of ENT can drive improved learning
• New, additional methods of assessment covering a range of competencies were shown to be valid and reliable
• Participation in a short answer question paper did not improve results in other forms of examination, but was valid for continuous assessment
• A thorough medical course should include lectures, clinical placement, continuous assessment and end of year examination
There was potential for bias in this study in a number of ways. As the short answer question paper was voluntary, it was possible that more dedicated or diligent students would have chosen to complete the paper, and thus the results could have been skewed in favour of the short answer question paper group. However, this was not evident, and in fact the average score in the objective structured clinical examination was lower in the short answer question paper group, although this finding was not significant. Alternatively, students who were weak in the area or unsure of their abilities may have opted to take the paper in order to test their knowledge. Some students may have been advantaged by having their two-week placement close to the end of year examinations and thus may have been more incentivised to engage in the learning process. They may also have heard questions from previous groups. Despite established marking schemes, there was potential for examiner bias in the short answer question paper and objective structured clinical examination, but the essay papers were reviewed by two examiners to ensure reliability.
Construct validity of the short answer question paper was not measured by pre- and post-intervention scoring. However, we have shown the test to be valid through criterion reference to two other examinations, and through its content in relation to the curriculum, and we therefore consider it valid for inclusion in the summative assessment.
Based on recent statistics on ENT teaching at undergraduate level, there is clearly a demand and need for more ENT undergraduate teaching.[30] This is important in order to catch up with other specialties,[31] particularly given the cross-relevance of ENT to many postgraduate disciplines. Furthermore, improved undergraduate ENT courses are likely to encourage higher calibre students into the specialty.[32] As assessment drives learning,[33] an important aspect of improving ENT at undergraduate level involves incorporating a variety of valid and appropriate assessment methods.
Conclusion
We have introduced two valid and reliable assessments to our ENT course. The short answer question paper did not improve results in other assessments; however, it introduces continuous assessment and acts as an assessment of basic knowledge, which provides important information to students regarding their performance. The objective structured clinical examination allows a practical assessment of clinical skills. These assessments have enhanced the undergraduate ENT course at our institution by driving learning through a programme of validated assessment methods across different domains of competency.
Acknowledgements
We would like to thank Frances Colgan and Michele Keane from the Dublin University School of Medicine for their assistance with this study.