Item analysis of type “A” multiple choice questions of biochemistry in GIM module, year 1 MBBS program
DOI: https://doi.org/10.18203/2320-6012.ijrms20250957

Keywords: Difficulty index, Discrimination index, Distractor efficiency, Item analysis, Type A MCQ, KR20, One best answer

Abstract
Background: Medical education is continually evolving to meet the demands of healthcare and scientific advancements. The Defence Services Medical Academy (DSMA) in Myanmar implemented an outcome-based curriculum in 2017. The genetics, immunology, and molecular medicine (GIM) module is a critical component of the first-year MBBS program, laying the foundation for knowledge in biochemistry and related sciences.
Methods: This study analyzed the quality of 28 type “A” multiple choice questions (MCQs) from the biochemistry section of the GIM module’s end-of-module assessment. The evaluation focused on four key indices: difficulty index (P), discrimination index (D), distractor efficiency (DE), and the Kuder-Richardson formula 20 (KR-20) for reliability.
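The first, second, and fourth of these indices can be computed directly from a scored response matrix. As an illustrative sketch only (not the paper’s own code), the snippet below computes P, D, and KR-20 from 0/1 item scores; the 27% high/low grouping for D is a common convention assumed here, and distractor efficiency is omitted because it requires per-option response counts rather than right/wrong marks.

```python
import numpy as np

def item_analysis(scores):
    """Compute item-analysis indices from a (n_students, n_items) 0/1 matrix.

    Returns (P, D, kr20). Distractor efficiency is not computed here:
    it needs counts of how many examinees chose each option (a distractor
    is conventionally "functional" if chosen by >5% of examinees).
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape

    # Difficulty index P: proportion of examinees answering each item correctly.
    P = scores.mean(axis=0)

    # Discrimination index D: difference in item pass rate between the
    # top 27% and bottom 27% of examinees ranked by total score.
    totals = scores.sum(axis=1)
    order = np.argsort(totals)
    g = max(1, int(round(0.27 * n)))
    low, high = scores[order[:g]], scores[order[-g:]]
    D = high.mean(axis=0) - low.mean(axis=0)

    # KR-20 reliability: (k/(k-1)) * (1 - sum(p*q) / variance of total scores).
    pq = P * (1 - P)
    kr20 = (k / (k - 1)) * (1 - pq.sum() / totals.var())
    return P, D, kr20
```

With 28 items, a KR-20 near 0.68 (as reported in the Results) would indicate moderate internal consistency under this formula.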
Results: Of the 28 items, 21 (75%) were of average difficulty, while 5 (18%) were deemed too difficult. Regarding discrimination, 12 items (43%) displayed very good discrimination, while 8 (29%) had poor discrimination, indicating a need for revision. Nearly 70% of the MCQs had fully functional distractors. Overall, the biochemistry questions showed moderate reliability (KR-20 = 0.682), with three items (11%) recommended for rejection due to poor performance.
Conclusions: These findings highlight the necessity of regular item analysis and revision to ensure the quality and fairness of assessments. Faculty development and active learning strategies are essential to improving the overall reliability and effectiveness of MCQs in medical education.