Analysis of multiple choice questions from a formative assessment of medical students of a medical college in Delhi, India

Authors

  • Richa Garg Department of Pharmacology, Dr. Baba Saheb Ambedkar Medical College & Hospital, Delhi, India
  • Vikas Kumar Department of Community Medicine, Dr. Baba Saheb Ambedkar Medical College & Hospital, Delhi, India
  • Jyoti Maria Department of Pharmacology, Dr. Baba Saheb Ambedkar Medical College & Hospital, Delhi, India

DOI:

https://doi.org/10.18203/2320-6012.ijrms20185375

Keywords:

Assessment, Difficulty index, Discrimination index, Distractor efficiency

Abstract

Background: Assessment is a dominant motivator that directs and drives students' learning. Different methods of assessment are used to assess medical knowledge in undergraduate medical education. Multiple choice questions (MCQs) are used increasingly because of their higher reliability, validity, and ease of scoring. Item analysis enables the identification of good MCQs based on the difficulty index (DIF I), discrimination index (DI), and distractor efficiency (DE).

Methods: Second-year MBBS students took a formative assessment test comprising 50 "one best response type" MCQs carrying 50 marks, with no negative marking. Each MCQ had a single stem with four options: one correct answer and three incorrect alternatives (distractors). Three question paper sets were prepared by shuffling the sequence of questions, and one of the three sets was given to each student to prevent copying from neighbouring students. In total, 50 MCQs and 150 distractors were analyzed, and the indices DIF I, DI, and DE were calculated.
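The three indices named above can be computed with the standard item-analysis formulas. The sketch below is illustrative only; the response counts, group sizes, and the 5% "functional distractor" cut-off are conventional assumptions, not values taken from this study.

```python
# Standard item-analysis formulas (a sketch; example data are hypothetical).

def difficulty_index(correct_high, correct_low, n_high, n_low):
    """DIF I (%) = (H + L) / N * 100, where H and L are the numbers of
    correct responses in the high- and low-scoring groups and N is the
    combined size of both groups."""
    return (correct_high + correct_low) / (n_high + n_low) * 100

def discrimination_index(correct_high, correct_low, group_size):
    """DI = (H - L) / n, where n is the size of one group
    (conventionally the top/bottom 27-33% of examinees)."""
    return (correct_high - correct_low) / group_size

def distractor_efficiency(distractor_counts, n_students):
    """DE (%) = share of distractors that are 'functional', i.e. chosen
    by at least 5% of examinees (a common convention)."""
    functional = sum(1 for c in distractor_counts if c / n_students >= 0.05)
    return functional / len(distractor_counts) * 100

# Hypothetical item: 29 students in each of the high/low groups.
dif = difficulty_index(25, 12, 29, 29)        # ~63.8% -> "recommended" (30-70%)
di = discrimination_index(25, 12, 29)         # ~0.45  -> "excellent"
de = distractor_efficiency([10, 5, 1], 87)    # 2 of 3 distractors functional
print(dif, di, de)
```

An item with DIF I in the 30-70% band, DI above roughly 0.35, and all three distractors functional (DE = 100%) would count as a well-constructed question under the criteria used in the paper.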

Results: The total scores of 87 students ranged from 17 to 48 (out of 50). The mean difficulty index (DIF I) was 71.6±19.4%. 28% of MCQs were of average difficulty and "recommended" (DIF I 30-70%). The mean discrimination index (DI) was 0.3±0.17. 16% of MCQs met the "good" and 50% the "excellent" criteria, while the rest were "poor/discard" according to DI criteria. The mean distractor efficiency (DE) was 63.4±33.3%. 90% of items had a DE between 33% and 100%. MCQs with a lower difficulty index (<70) had significantly higher distractor efficiency (93.8% vs. 6.2%, p=0.004).

Conclusions: Item analysis provided the data needed to improve question formulation and helped in revising and improving the quality of both the items and the test. Questions with a lower difficulty index (<70) were significantly associated with a higher discrimination index (>0.15) and higher distractor efficiency.

References

Mehta M, Banode S, Adwal S. Analysis of multiple choice questions (MCQ): important part of assessment of medical students. Int J Med Res Rev. 2016;4(2).

Case S, Swanson D. Constructing written test questions for the basic and clinical sciences. 3rd ed. Philadelphia: National Board of Medical Examiners; 2003.

Tarrant M, Ware J. A framework for improving the quality of multiple-choice assessments. Nurse Educ. 2012;37(3):98-104.

Gajjar S, Sharma R, Kumar P, Rana M. Item and test analysis to identify quality multiple choice questions (MCQs) from an assessment of medical students of Ahmedabad, Gujarat. Indian J Community Med. 2014;39(1):17.

University of Washington. Office of Educational Assessment. Understanding Item Analyses. Available at: http://www.washington.edu/assessment/scanning-scoring/scoring/reports/item-analysis/. Accessed 20 December 2018.

Singh T, Gupta P, Singh D. Principles of Medical Education. 3rd ed. New Delhi: Jaypee Brothers Medical Publishers (P) Ltd. Test and item analysis; 2009:70.

Matlock-Hetzel S. Basic concepts in item and test analysis. Presented at the annual meeting of the Southwest Educational Research Association, Austin, 1997. Available at: www.ericae.net/ft/tamu/Espy.htm.

Eaves S, Erford B. The Gale Group. The purpose of item analysis, item difficulty, discrimination index, characteristic curve. Available at: www.education.com/reference/article/itemanalysis/.

Sarin YK, Khurana M, Natu MV, Thomas AG, Singh T. Item analysis of published MCQs. Indian Pediatr. 1998;35:1103-5.

Tarrant M, Ware J, Mohammed AM. An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Med Educ. 2009;9:1-8.

Scantron Guides: Item Analysis, adapted from the Michigan State University website and Barbara Gross Davis's Tools for Teaching. Available at: www.freepdfdb.com/pdf/item-analysis-scantron.

Hingorjo MR, Jaleel F. Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. J Pak Med Assoc. 2012 Feb;62(2):142-7.

Cizek GJ, O’Day DM. Further investigations of nonfunctioning options in multiple choice test items. Educ Psychol Meas. 1994;54(4):861-72.

Patel R. Use of item analysis to improve quality of multiple choice questions in II MBBS. J Edu Technol Heal Sci. 2017;4(1):22-9.

Saxena S, Srivastava P, Mallick A, Joshi H, Das B. Item analysis: An unmatched tool for validating MCQs in Medical Education. Ind J Basic Appl Med Res. 2016;5(4):263-9.

Kolte V. Item analysis of multiple choice questions in physiology examination. Indian J Basic Appl Med Res. 2015;4(4):320-6.

Mehta G, Mokhasi V. Item analysis of multiple choice questions-an assessment of the assessment tool. Int J Health Sci Res. 2014;4(7):197-202.

Published

2018-12-26

How to Cite

Garg, R., Kumar, V., & Maria, J. (2018). Analysis of multiple choice questions from a formative assessment of medical students of a medical college in Delhi, India. International Journal of Research in Medical Sciences, 7(1), 174–177. https://doi.org/10.18203/2320-6012.ijrms20185375

Section

Original Research Articles