Evaluation of undergraduate ophthalmology medical students' methods of assessment during the COVID-19 pandemic

Document Type: Original Article

Author

Ophthalmology, Faculty of Medicine of Boys, Al-Azhar University, Cairo, Egypt

Abstract

Background: Covid-19 pandemic negatively impacted traditional education systems.
Aim of the work: Evaluation of undergraduate ophthalmology medical students' assessment methods in absence of face to face oral exams (OE) and live on-patient practical exams (PE) in Al-Azhar Faculty of Medicine of Boys during COVID-19 pandemic (Cairo,2021).
Participants and Methods: 230 students responded to online survey. Objective evaluation was also done by analyzing the total marks achieved by each student.
Results: Formative exams (FE) helped students for better achievement in summative written exams and Objective structured clinical exam (OSCE). As regards OSCE; data-show questions covered the assessment of most studied ophthalmology signs with high quality images. Variation of questions in written exam including MCQ, short answer questions (SAQ), structured limited essay questions (EQ) and case scenario (CS) based questions allowed assessment of most course topics precisely and simply, and could replace OE, and that was safer than face to face OE during Covid-19 pandemic. 97.8 % of students succeeded to pass ophthalmology module exams with 44.2% of them achieved A+ grade.
Conclusion: It is recommended to apply FE frequently after each chapter. The use of variable forms of examinations and questions can replace face to face OE and live on-patient PE for undergraduate ophthalmology students. Students should practice answering scenario based questions.
Keywords: Assessment of medical students; Formative assessment; Objective structured clinical examination; Oral exam; Single best answer questions; Short answer questions; Structured essay questions; Scenario based questions; COVID-19 pandemic



INTRODUCTION

Most educators agree that there are two main types of assessment: formative examinations (FE), which occur continuously during the educational process and provide systematic, continuous feedback at all stages of the educational situation,1 and summative assessment, which occurs at the end of the learning process and results in a ranking, a mark or a degree.2

Multiple choice questions (MCQ) can assess factual recall and problem-solving.3 They are often used to test students' knowledge of a broad range of content and are preferred for their objectivity and the ease of scoring large numbers of students at a time; correct framing of MCQ is thus essential.4 Concerns have been raised about the limitations of MCQ, including cueing and question construction: both poor grammar and poor question structure reduce test validity.5 Critics of MCQ examinations have also questioned whether such questions can assess the higher cognitive levels and the clinical skills needed for professional success.6

Short answer questions (SAQ) are most often used to test basic knowledge of key facts and terms, but they can also be used to test higher thinking skills, including analysis and evaluation.7 SAQ can be constructed faster than MCQ,6 and unlike MCQ, they make it difficult for students to guess the answer. SAQ give students more flexibility to explain their understanding and demonstrate creativity than MCQ; this also means that scoring is relatively laborious and can be quite subjective.8 SAQ provide more structure than essay questions (EQ) and thus are often easier and faster to mark, and they often test a broader range of course content than full EQ.7

Like SAQ, EQ provide students with an opportunity to explain their understanding and demonstrate creativity.8 EQ make it hard for students to arrive at an acceptable answer by bluffing. They can be constructed reasonably quickly and easily, but using only essay questions restricts the overall scope of an exam to a few topics or areas; marking them can be time-consuming, and grader agreement can be difficult.9

Case scenario-based multiple choice questions (CS-MCQ) provide opportunities for the integration of sub-specialties and for assessment in keeping with problem-based learning.10 CS-MCQ are reliable and practical, and they promote multi-logical and critical thinking.11

Oral exams (OE) allow students to respond directly to the instructor's questions and/or to present prepared statements. These exams are especially popular in language courses that demand speaking, but they can be used to assess understanding in almost any course by following the guidelines for composing SAQ.12 Advantages of OE include nearly immediate feedback, which allows students to learn as they are tested; drawbacks include the amount of time required and the problem of record-keeping.13

The coronavirus disease 2019 (COVID-19) lockdown negatively impacted face-to-face learning and practical on-patient methods of student assessment.14 COVID-19 forced a transformation from traditional face-to-face learning to electronic learning,15 and this transformation is expected to have long-lasting effects on teaching, learning, and assessment procedures and methods.16 The aim of this work was therefore to evaluate the methods of undergraduate ophthalmology student assessment during the integrated ophthalmology learning program of Al-Azhar Faculty of Medicine of Boys (Cairo, 2021), which used varied forms of examinations and question items in the absence of face-to-face OE and on-patient practical examinations (PE) during the COVID-19 pandemic, and to answer the question: can the use of varied assessment tools, including objective structured clinical examinations (OSCE), MCQ, SAQ, EQ and CS based questions, replace OE and on-patient PE for undergraduate ophthalmology medical students?

MATERIALS AND METHODS

A total of 230 third-level students responded to a survey of 24 Likert-scale questions (Q), administered via the Google platform, to evaluate the methods used for student assessment after completing the 6-week integrated ophthalmology course (16 March 2021 to 22 April 2021). Student assessment during the course included three online FE (two single best answer (SBA) exams and one modified OSCE), a worksheet activity, a summative SBA exam and OSCE at the end of the course, and a final summative written exam at the end of the semester comprising single best answer questions (SBAQ), SAQ, structured EQ and CS based questions. The FE were delivered using Google platforms, while the summative OSCE and written exams were conducted under full protective and social distancing measures owing to the COVID-19 partial lockdown in Egypt. The total mark was 100, and the assessment did not include OE or live on-patient practical exams.

The analysis of students' responses was based on a five-point scale from 1 (strongly disagree) to 5 (strongly agree). Students were assured of confidentiality and were promised that their names would not appear in the document.

The assessment methods were also evaluated objectively by analyzing the total marks achieved by each student at the end of the semester.

The Statistical Package for the Social Sciences (SPSS) version 24 (IBM Corporation, New York) was used. Mode, mean (M) and standard deviation (SD) values were calculated, a paired t-test was used to measure whether students' responses differed statistically from neutrality, and the probability (P) value was considered significant at P < 0.05.
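The original analysis was carried out in SPSS; purely as an illustration, the following Python sketch reproduces the same descriptive statistics and the neutrality test for a single questionnaire item. The responses array is hypothetical, and pairing each response with the constant neutral value of 3 reduces mathematically to a one-sample t-test against 3.

```python
# Illustrative sketch only: the study used SPSS v24. This reproduces the same
# descriptive statistics and neutrality test for one Likert-scale item.
# The "responses" array is hypothetical, not the study data.
from collections import Counter

import numpy as np
from scipy import stats

responses = np.array([5, 4, 4, 3, 2, 4, 5, 3, 4, 4])  # hypothetical 1-5 Likert answers
NEUTRAL = 3

mode = Counter(responses.tolist()).most_common(1)[0][0]   # most frequent response
mean, sd = responses.mean(), responses.std(ddof=1)        # M and SD

# Testing responses paired against the constant neutral value 3 is equivalent
# to a one-sample t-test of the responses against 3.
t_stat, p_value = stats.ttest_1samp(responses, popmean=NEUTRAL)

print(f"Mode = {mode}, M ± SD = {mean:.2f} ± {sd:.2f}")
print(f"t = {t_stat:.2f}, P = {p_value:.3f}, significant = {p_value < 0.05}")
```

Running such a computation for each of the 24 items would yield the mode, M ± SD, t and P values of the kind reported in Table 1.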

RESULTS

The participating students responded to all questions (Figures 1-24), and the statistical data of their responses (Table 1), including mode, M, SD, t-test results and P values, were analyzed.

Overall, 61 % of students (20 % strongly agreed and 41 % agreed) reported that FE helped them in the subsequent summative OSCE and written exams, while 28 % gave a neutral response and 11 % disagreed. In addition, 48 % of students agreed that three FE during the course were enough, while 22 % disagreed.

Most students reported that the end-of-semester examination paper was regular, all question sections were clearly printed, and the mark distribution was clear. Most students preferred MCQ with four options over MCQ with five options, and most reported that the duration of the summative OSCE was suitable. Regarding timing, 46 % of students agreed that the duration of the end-of-module SBA exam was suitable while 20 % disagreed, and 47 % agreed that the duration of the end-of-semester written exam was suitable while 23 % disagreed.

Most students reported that the OSCE slides (Figure 25) covered the assessment of most of the studied ophthalmology signs with high-quality images. Of the students, 49 % agreed that a data-show-based OSCE could replace live on-patient PE during the corona pandemic, while 19 % disagreed. When asked whether on-patient PE could be permanently replaced by OSCE, 42 % of students agreed, but the responses were inconclusive and not statistically significantly different from neutrality.

Most students reported that the variation of question formats allowed most course topics to be assessed precisely and simply, so students should review and study all chapters and topics when preparing for the exam rather than focusing only on well-known topics in order to pass.

The majority of students (72 %) agreed that using varied forms of questions, including MCQ, SAQ, structured limited EQ and CS based questions, can always replace OE for undergraduate ophthalmology student assessment and that this was safer than face-to-face OE during the corona pandemic. A majority (70 %) reported that varied forms of written questions save more time than OE, 68 % reported that marks can be calculated more easily, correctly and justifiably with varied forms of written questions than with OE, and 62 % reported that varied forms of written questions are less susceptible to human judging factors and to variability in question difficulty than OE. When asked whether OE give students more self-confidence and practical sense, the responses were inconclusive and not clearly different from neutrality.

Most students agreed that CS based MCQ and SAQ can evaluate higher levels of student thinking, including application, evaluation and analysis. Of the students, 44 % agreed that CS based questions can help replace live on-patient PE for undergraduate ophthalmology students during the corona pandemic. Student responses were not statistically different from neutrality when asked whether they preferred CS questions and whether they had trained enough to answer these types of questions.

Analysis of the total marks achieved by each student (Table 2) showed that 97.8 % of students passed the ophthalmology module examinations, with 44.2 % of them achieving A+ grades (≥ 90 marks).

 

Fig. 1: Students' responses to Q1 (FE helped me in the subsequent summative OSCE and written exams.)

Fig. 2: Students' responses to Q2 (FE were enough in number during the course.)

Fig. 3: Students' responses to Q3 (The end-of-semester examination paper was regular, all question sections were clearly printed and the mark distribution was clear.)

Fig. 4: Students' responses to Q4 (I prefer MCQ with 4 options rather than MCQ with 5 options.)

Fig. 5: Students' responses to Q5 (The duration of the summative OSCE was suitable.)

Fig. 6: Students' responses to Q6 (The duration of the end-of-module written exam was suitable.)

Fig. 7: Students' responses to Q7 (The duration of the end-of-semester written exam was suitable.)

Fig. 8: Students' responses to Q8 (The summative OSCE covered the assessment of most studied ophthalmology signs with high-quality images.)

Fig. 9: Students' responses to Q9 (Data-show images in OSCE can replace live patient-dependent PE for undergraduate ophthalmology student assessment during the corona pandemic.)

Fig. 10: Students' responses to Q10 (Data-show images in OSCE can always replace live patient-dependent PE for undergraduate ophthalmology student assessment.)

Fig. 11: Students' responses to Q11 (The variation of question formats allows assessment of most course topics precisely and simply.)

Fig. 12: Students' responses to Q12 (With the integrated ophthalmology course, I should read and study all topics before the exam and not focus only on well-known topics to pass the exam.)

Fig. 13: Students' responses to Q13 (Use of varied forms of questions such as MCQ, SAQ, structured limited short EQ and CS based questions can replace OE.)

Fig. 14: Students' responses to Q14 (Written assessment with varied forms of questions such as MCQ, SAQ and structured limited EQ was safer than face-to-face OE during the corona pandemic.)

Fig. 15: Students' responses to Q15 (Written assessment with varied forms of questions such as MCQ, SAQ and structured limited EQ can save more time than OE.)

Fig. 16: Students' responses to Q16 (In written assessment including varied forms of questions, marks can be calculated more easily and correctly, with more justification, than in OE.)

Fig. 17: Students' responses to Q17 (Written assessment using MCQ, SAQ and short EQ is less susceptible to human judging factors and to variability in question difficulty than OE.)

Fig. 18: Students' responses to Q18 (OE provide students with more self-confidence.)

Fig. 19: Students' responses to Q19 (OE provide students with more practical sense.)

Fig. 20: Students' responses to Q20 (CS based MCQ and SAQ can evaluate higher degrees of student thinking and cognitive levels such as application, evaluation and analysis.)

Fig. 21: Students' responses to Q21 (CS based MCQ and SAQ can always replace live patient-dependent PE and live history-taking skills in ophthalmology.)

Fig. 22: Students' responses to Q22 (CS based MCQ and SAQ can replace live patient-dependent PE in ophthalmology during the corona pandemic.)

Fig. 23: Students' responses to Q23 (I prefer CS based questions rather than traditional direct questions.)

Fig. 24: Students' responses to Q24 (I have trained enough to answer CS based questions.)

Fig. 25: Examples of OSCE slides

 

 

Q number | Mode of response | M ± SD | t value | P value | Significance
1 | 4 | 3.66 ± 1.01 | 6.55 | P < 0.05 | Significant
2 | 4 | 3.33 ± 1.03 | 3.22 | P < 0.05 | Significant
3 | 4 | 3.56 ± 1.03 | 5.45 | P < 0.05 | Significant
4 | 5 | 3.93 ± 1.10 | 8.43 | P < 0.05 | Significant
5 | 4 | 3.51 ± 1.05 | 4.81 | P < 0.05 | Significant
6 | 4 | 3.26 ± 1.04 | 2.50 | P < 0.05 | Significant
7 | 4 | 3.24 ± 1.12 | 2.14 | P < 0.05 | Significant
8 | 4 | 3.73 ± 0.95 | 7.67 | P < 0.05 | Significant
9 | 4 | 3.40 ± 1.06 | 3.61 | P < 0.05 | Significant
10 | 3 | 3.18 ± 1.14 | 1.58 | P > 0.05 | Not significant
11 | 4 | 3.43 ± 0.98 | 4.40 | P < 0.05 | Significant
12 | 4 | 3.46 ± 1.10 | 4.16 | P < 0.05 | Significant
13 | 5 | 3.96 ± 1.07 | 8.95 | P < 0.05 | Significant
14 | 5 | 4.03 ± 1.05 | 9.82 | P < 0.05 | Significant
15 | 5 | 3.97 ± 1.02 | 9.51 | P < 0.05 | Significant
16 | 5 | 3.94 ± 1.04 | 9.01 | P < 0.05 | Significant
17 | 4 | 3.71 ± 1.15 | 6.18 | P < 0.05 | Significant
18 | 3 | 2.83 ± 1.15 | 1.48 | P > 0.05 | Not significant
19 | 3 | 2.74 ± 1.14 | 2.28 | P < 0.05 | Significant
20 | 4 | 3.55 ± 0.95 | 5.81 | P < 0.05 | Significant
21 | 3 | 3.02 ± 1.06 | 0.19 | P > 0.05 | Not significant
22 | 3 | 3.30 ± 1.02 | 2.94 | P < 0.05 | Significant
23 | 3 | 3.14 ± 1.01 | 1.39 | P > 0.05 | Not significant
24 | 3 | 2.97 ± 1.11 | 0.46 | P > 0.05 | Not significant

Table 1: Statistical analysis of students' responses

 

Marks achieved (total: 100) | Percentage of students
≥ 90 | 44.2 %
85-89 | 21.8 %
80-84 | 14.8 %
75-79 | 8.00 %
70-74 | 4.80 %
65-69 | 3.00 %
60-65 | 1.20 %
< 60 (failed) | 1.20 %
< 60 (failed due to absence from one or more exams) | 1.00 %

Table 2: Analysis of total marks achieved by each student
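As an illustration of how grade-band percentages of the kind shown in Table 2 can be derived from raw totals, the following Python sketch tabulates band frequencies and the overall pass rate. The marks array is hypothetical, and the 60-65 band of Table 2 is read here as 60-64.

```python
# Illustrative sketch: tabulating a grade-band distribution (as in Table 2)
# from total marks out of 100. The marks below are hypothetical, not the study data.
import numpy as np

marks = np.array([95, 91, 88, 86, 83, 77, 72, 68, 62, 55])  # hypothetical totals

edges = [0, 60, 65, 70, 75, 80, 85, 90, 101]                # band boundaries (right-open)
labels = ["< 60 (failed)", "60-64", "65-69", "70-74",
          "75-79", "80-84", "85-89", ">= 90 (A+)"]

counts, _ = np.histogram(marks, bins=edges)
for label, count in zip(labels, counts):
    print(f"{label:>14}: {100 * count / len(marks):.1f} %")

pass_rate = 100 * (marks >= 60).mean()                      # share of students scoring >= 60
print(f"Passed: {pass_rate:.1f} %")
```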

 

 

DISCUSSION

The present study aimed to evaluate the efficiency of undergraduate ophthalmology medical students' assessment methods during the corona pandemic in Egypt. Its strengths include that it is the first of its kind to be carried out on undergraduate ophthalmology medical students in an integrated ophthalmology education program in Egyptian educational institutions in general, and in Al-Azhar University in particular. The study included a relatively large number of students and used both subjective and objective evaluation tools. Moreover, most of the students' responses to the Likert-scale questionnaire (79 %) were statistically significantly different from neutrality. The limitations of the study include that the survey was applied only to the ophthalmology curriculum and reflected the students' points of view, which could be affected by cultural and individual preference factors. Further studies on other medical curricula, and further surveys including supervisors' and instructors' opinions, should be conducted.

The present results are in agreement with other recent studies concerned with the analysis of different forms of questions and examination methods for undergraduate students.

The benefits of FE were also demonstrated by Ozan and Kincal (2018), who reported that application of FE improved students' attitudes toward assessment and increased their achievement.17 Jud et al. (2020) concluded that MCQ accompanying every lecture improved students' performance in the final exam.18 In the present study, only three electronic FE were presented to the students, so it is recommended to use more FE (following each chapter).

Vegada et al. (2016) reported that five-option MCQ demand more energy, time and expertise.19 Rahma et al. (2017) reported that assessment reliability is low when based on three-option MCQ.20 All MCQ in the present study were five-option.

Budiyono (2019) analyzed non-functioning distractors, item facility and item discrimination to compare five-option and four-option MCQ, and concluded that teachers are free to choose either version depending on the purpose of the test.21 The present results revealed that students prefer MCQ with four options, so further studies should be conducted to determine the most suitable MCQ format for undergraduate ophthalmology student assessment.

Sam et al. (2018) reported that valid assessment of undergraduate and postgraduate knowledge can be improved by the use of SAQ, because SAQ can test nascent physician ability.7 In the present study, the written assessment included varied forms of questions, and the end-of-semester exam paper included SBAQ, SAQ, structured EQ and CS based questions to increase the reliability and validity of student assessment.

The benefits of OE include better retention of concepts and better academic performance.13 The absence of face-to-face OE and live on-patient PE was obligatory owing to our integrated education program rules and national COVID-19 precautions, and this was compensated for by using varied forms of exams and questions.

Disadvantages of online OE and other virtual exams include the inability of educators to prevent cheating,22 and there are concerns about the ability to verify user identity.23 Therefore, online platforms were used only for FE in the present study, and all summative exams were conducted under full protective and social distancing measures in large, computerized, well-aerated halls after dividing the students into smaller subgroups.

Ellis et al. (2021) documented that social distancing regulations and lockdown measures resulted in many written and clinical examinations being cancelled during the initial surge of the virus, and that the General Medical Council in the UK approved unprecedented changes to clinical examinations, including virtual assessment.24 In the present study, the practical assessment depended on an OSCE using data-show case presentations that assessed the undergraduates' ability to diagnose, differentiate, investigate and manage ophthalmic signs.

CONCLUSION

FE are very important for preparing undergraduate medical students for better acceptance of, and achievement in, summative PE and written assessments, and it is recommended to increase the number and forms of FE during every course. Using varied forms of exams and questions, including OSCE, SBAQ, SAQ, structured limited EQ and CS based questions, can replace face-to-face OE and live on-patient PE for undergraduate ophthalmology medical students during the corona pandemic. Moreover, combining different forms of questions in the exam paper allows most course topics to be assessed adequately, precisely and simply, and encourages medical students to study all course topics before the exam. Scenario-based questions can evaluate higher levels of students' thinking and cognition, so it is recommended to train students sufficiently in answering these forms of questions. Further studies on different medical curricula concerning new methods of student assessment, especially during the corona pandemic, are highly recommended. Finally, conducting interactive surveys of undergraduate medical students is suggested to be very important for improving learning and teaching goals.

REFERENCES
1. Ertle B, Rosenfeld D and Presser AL. Preparing preschool teachers to use and benefit from formative assessment: Assessment professional development system. ZDM Mathematics Education. 2016; 48: 977–89.
2.  Faulconer E, Griffith JC and Frank H. If at first you do not succeed: student behavior when provided feed forward with multiple trials for online summative assessments. Teaching in Higher Education. 2021; 26(4): 586-601.
3. Salih KE, Jibo A, and Ishaq M. Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia. J Family Med Prim Care. 2020; 9(7):3663-68.
4. Gupta P, Meena P, Khan AM, et al. Effect of faculty training on quality of multiple-choice questions. Int J Appl Basic Med Res. 2020; 10(3):210-14.
5. Cook AK, Lidbury JA, Creevy KE, et al. Multiple-choice questions in small animal medicine: an analysis of cognitive level and structural reliability, and the impact of these characteristics on student performance. J Vet Med Educ. 2020; 47(4):497-505.
6. Puthiaparampil T and Rahman M. Very short answer questions: a viable alternative to multiple choice questions. BMC Med Educ. 2020; 20:141.
7. Sam AH, Field SM, Collares CF, et al. Very-short-answer questions: reliability, discrimination and acceptability. Med Educ. 2018; 52(4):447-55.
8. Hameed S, Sam AH and Harris J. Validity of very short answer versus single best answer questions for undergraduate assessment. BMC Med Educ. 2016; 16: 266.
9. Westacott R, Gurnell M and Sam AH. Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study. BMJ Open. 2019; 9(9): 1-7.
10. Bi M, Zhao Z, Yang J, et al. Comparison of case-based learning and traditional method in teaching postgraduate students of medical oncology. Med Teach. 2019; 41(10):1124-28.
11. Vuma S and Sa B. A comparison of clinical-scenario (case cluster) versus stand-alone multiple choice questions in a problem-based learning environment in undergraduate medicine. J Taibah Univ Med Sci. 2016; 11(1):14-26.
12. Pernar LIM, Askari R and Breen EM. Oral examinations in undergraduate medical education - what is the 'value added' to evaluation? The American Journal of Surgery. 2020; 220(2): 328-33.
13. Hazen H and Hamann H. Assessing student learning in field courses using oral exams. The Geography Teacher. 2020; 17(4): 130-35.
14. Mishra, Nair D, Gopinathan A, et al. The impact of COVID-19 related lockdown on ophthalmology training programs in India – Outcomes of a survey. Indian Journal of Ophthalmology. 2020; 68(6): 999- 1004.
15. Cheong C, Coldwell-Neilson J, MacCallum K, et al. COVID-19 and education: Learning and teaching in a Pandemic-Constrained Environment. Informing Science Press. 2021; 20:443-63.
16. Khan RA and Jawaid M. Technology enhanced assessment (TEA) in COVID 19 Pandemic. Pak J Med Sci. 2020; 36(4):108–10. 
17. Ozan C and Kincal RY. The effects of formative assessment on academic achievement, attitudes toward the lesson, and self-regulation skills. Educational Sciences: Theory & Practice. 2018; 18: 85–118.
18. Jud SM, Cupisti S and Frobenius W. Introducing multiple-choice questions to promote learning for medical students: effect on exam performance in obstetrics and gynecology. Arch Gynecol Obstet. 2020; 302(6): 1401-06.
19. Vegada B, Shukla A, Khilnani A, et al. Comparison between three option, four option and five option multiple choice question tests for quality parameters: A randomized study. Indian J Pharmacol. 2016; 48(5):571-75.
20. Rahma NA, Shamad MM, Idris ME, et al. Comparison in the quality of distractors in three and four options type of multiple choice questions. Adv Med Educ Pract. 2017; 8:287-91.
21. Budiyono B. Five-option vs four-option multiple-choice questions. Indonesian Journal of English Teaching. 2019; 8(2):1-7.
22. Xiong Y and Suen HK. Assessment approaches in massive open online courses: Possibilities, challenges and future directions. International Review of Education. 2018; 64:241–63.
23. Baro X, Bernaus RM, Beneres D, et al. Biometric tools for learner identity in E-assessment. Engineering data-adaptive trust-based e-assessment system. 2020; 10: 41-65.
24. Ellis R, Oeppen RS and Brennan PA. Virtual postgraduate exams and assessments: the challenges of online delivery and optimising performance. Br J Oral Maxillofac Surg. 2021; 59(2): 233-37.