Item analysis of multiple-choice questions in summative assessment for professional examination I of an outcome-based integrated MBBS curriculum

Authors

  • Kyaw Zwar Htoon Department of Pharmacology, Defence Services Medical Academy, Yangon, Myanmar
  • Ye Phyo Aung Department of Medical Education, Defence Services Medical Academy, Yangon, Myanmar

DOI:

https://doi.org/10.18203/2320-6012.ijrms20241226

Keywords:

Difficulty index, Discrimination index, Item, Professional Examination 1

Abstract

Background: This study presents an item analysis of multiple-choice questions (MCQ-Type A) used in the summative assessment for Professional Examination I at Defence Services Medical Academy, Yangon, Myanmar. The objectives of the study were to perform item analysis using the Difficulty Index (DIF I) and Discrimination Index (DI) and to examine the correlation between DIF I and DI.

Methods: A cross-sectional observational study was conducted with 200 multiple-choice questions from two written examination papers answered by 46 medical year 2 students of Defence Services Medical Academy, Yangon, Myanmar. Item analysis of the multiple-choice questions was performed post-exam using DIF I and DI.
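The two indices used in the methods are standard psychometric quantities: DIF I is the proportion of examinees answering an item correctly, and DI contrasts performance between high- and low-scoring groups (a common convention uses the top and bottom 27% of examinees). The sketch below illustrates these textbook formulas under those assumptions; the function names and the 27% split are illustrative, not taken from the paper.

```python
# Illustrative post-exam item analysis (hypothetical names; the 27% group
# split and category cut-offs vary across conventions).

def difficulty_index(correct_responses: int, total_students: int) -> float:
    """DIF I: fraction of all students answering the item correctly (0-1)."""
    return correct_responses / total_students

def discrimination_index(upper_correct: int, lower_correct: int,
                         group_size: int) -> float:
    """DI: (correct in upper group - correct in lower group) / group size.

    Ranges from -1 to +1; negative values suggest a key error or
    ambiguous wording, since weaker students outperform stronger ones.
    """
    return (upper_correct - lower_correct) / group_size

# Example: 46 examinees, upper/lower groups of 12 students (about 27% of 46)
dif = difficulty_index(29, 46)        # ~0.63, typically classed as "easy"
di = discrimination_index(10, 4, 12)  # 0.5, typically "excellent"
```

A negative DI, as flagged in the Results below, falls out of the same formula when `lower_correct` exceeds `upper_correct`.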

Results: The majority of items were categorized as easy based on DIF I, with 63% and 60% in Papers I and II, respectively. Only about one-third of items were deemed acceptable, and few fell into the difficult category. DI ranged from negative to excellent, with 62% and 61% of MCQs in Papers I and II showing acceptable to excellent discrimination. Items with poor discrimination (35% and 34% in Papers I and II) should be revised or discarded, and items with negative DI should be re-evaluated for potential key errors or ambiguous wording. A low negative correlation between DIF I and DI was observed, indicating that discrimination power decreased as items became easier. Notably, items in the easy DIF I category showed a moderate negative correlation with DI, consistent with previous research.
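The reported DIF I-DI correlation is a Pearson product-moment correlation computed over item-level pairs. A minimal sketch, assuming per-item DIF I and DI values are already available (the sample data below are made up for illustration):

```python
def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical item-level values: as DIF I rises (items get easier),
# DI tends to fall, yielding a negative r as the study reports.
dif_i = [0.30, 0.50, 0.70, 0.90]
di    = [0.40, 0.35, 0.20, 0.05]
r = pearson_r(dif_i, di)  # negative, reflecting the inverse relationship
```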

Conclusions: This study underscores the importance of item analysis to enhance the validity of assessment tools and ensure the effective evaluation of student cognition levels. Consequently, reconstruction and modification of MCQs are recommended to improve assessment quality and accurately measure student abilities.



Published

2024-04-30

How to Cite

Htoon, K. Z., & Aung, Y. P. (2024). Item analysis of multiple-choice questions in summative assessment for professional examination I of an outcome-based integrated MBBS curriculum. International Journal of Research in Medical Sciences, 12(5), 1451–1456. https://doi.org/10.18203/2320-6012.ijrms20241226

Section

Original Research Articles