Impact of distractors in item analysis of multiple choice questions

Authors

  • Ismail Burud Department of Surgery, International Medical University, Jalan Dr Muthu, Bukit Rasah, 70300 Seremban, Negeri Sembilan, Malaysia
  • Kavitha Nagandla Department of Obstetrics and Gynaecology, International Medical University, Jalan Dr Muthu, Bukit Rasah, 70300 Seremban, Negeri Sembilan, Malaysia
  • Puneet Agarwal Department of Surgery, International Medical University, Jalan Dr Muthu, Bukit Rasah, 70300 Seremban, Negeri Sembilan, Malaysia

DOI:

https://doi.org/10.18203/2320-6012.ijrms20191313

Keywords:

Distractor efficiency, Item analysis, One best answer

Abstract

Background: Item analysis is a quality assurance process that examines the performance of individual test items and measures the validity and reliability of an examination. This study was performed to evaluate the quality of test items with respect to their difficulty index (DFI), discrimination index (DI), and functional and non-functional distractors (FDs and NFDs).
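
For orientation, the three indices are conventionally defined as follows. This is a sketch of the standard formulas from the item-analysis literature (e.g. Hingorjo and Jaleel); the symbols H, L and N are illustrative notation, not taken from this paper:

    \mathrm{DFI} = \frac{H + L}{N}, \qquad
    \mathrm{DI} = \frac{2(H - L)}{N}, \qquad
    \mathrm{DE} = \frac{\mathrm{FD}}{\mathrm{FD} + \mathrm{NFD}} \times 100\%

where H and L are the numbers of correct responses in the upper- and lower-scoring groups, N is the combined size of the two groups, and a distractor counts as non-functional (NFD) when fewer than 5% of examinees select it.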

Methods: This study was performed on a summative examination taken by 113 students. The analysis covered 120 one-best-answer items (OBAs) and 360 distractors.

Results: Of the 360 distractors, 85 were chosen by fewer than 5% of students, giving a distractor efficiency of 23.6%. Forty-seven (13%) items had no NFDs, while 51 (14%), 30 (8.3%), and 4 (1.1%) items contained 1, 2, and 3 NFDs respectively. The majority of items showed an excellent difficulty index (50.4%, n=42) and fair discrimination (37%, n=33). Items with an excellent difficulty index and discrimination index showed a statistically significant association with 1 NFD and 2 NFDs (p=0.03).
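
A minimal sketch of how such distractor counts can be computed, assuming each item's responses are available as a list of chosen options; the function name, the four-option format, and the 5% cut-off are assumptions drawn from the abstract, not the authors' actual analysis script:

    from collections import Counter

    def distractor_stats(choices, key, n_options=4, threshold=0.05):
        # Illustrative sketch, not the authors' code.
        # choices: option picked by each student for one item, e.g. ['A', 'C', 'B', ...]
        # key: the correct option, e.g. 'B'
        # A distractor is non-functional (NFD) if fewer than 5% of students chose it.
        counts = Counter(choices)
        n_students = len(choices)
        options = [chr(ord('A') + i) for i in range(n_options)]
        distractors = [opt for opt in options if opt != key]
        n_nfd = sum(1 for d in distractors
                    if counts.get(d, 0) / n_students < threshold)
        n_fd = len(distractors) - n_nfd
        efficiency = 100.0 * n_fd / len(distractors)  # distractor efficiency, in %
        return n_fd, n_nfd, efficiency

    # Example with 10 students and key 'B'; 'D' is chosen by no one, so it is
    # non-functional: distractor_stats(list('ABABBCABBB'), 'B') ≈ (2, 1, 66.7)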

Conclusions: Post-examination evaluation of item performance is one of the quality assurance methods for identifying the best-performing items for a quality question bank. Distractor efficiency gives information on the overall quality of an item.

References

Carneson J, Delpierre G, Masters K. Designing and managing MCQs: Appendix C: MCQs and Bloom's taxonomy. 2011. Available at: http://web.uct.ac.za/projects/cbe/mcqman/mcqappc.html. Accessed 2 February 2011.

Case SM, Swanson DB. Constructing written test questions for the basic and clinical sciences. 3rd ed. National Board of Medical Examiners; 2010. Available at: http://www.nbme.org/publications/item-writing-manual.html. Accessed 5 February 2011.

Hotiu A. The relationship between item difficulty and discrimination indices in multiple-choice tests in a Physical science course. Boca Raton, Florida: Florida Atlantic University; 2006. Available at: http://www.physics.fau.edu/research/education/A.Hotiu_thesis.pdf. Accessed 14 June 2015.

Cizek GJ, O'Day DM. Further investigation of nonfunctioning options in multiple-choice test items. Educ Psychol Meas. 1994 Dec;54(4):861-72.

Saudi Commission for Health Specialties. Item writing manual for multiple-choice questions. 2015. Available at: http://www.scfhs.org.sa/education/HighEduExams/Guidlines/mcq/Documents/MCQ%20Manual.pdf. Accessed 12 June 2015.

Quaigrain K, Arhin AK. Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education. 2017 Jan 1;4(1):1301013.

Considine J, Botti M, Thomas S. Design, format, validity and reliability of multiple choice questions for use in nursing research and education. Collegian. 2005 Jan 1;12(1):19-24.

Kheyami D, Jaradat A, Al-Shibani T, Ali FA. Item Analysis of Multiple Choice Questions at the Department of Paediatrics, Arabian Gulf University, Manama, Bahrain. Sultan Qaboos University Med J. 2018;18(1):e68-e74.

Hingorjo MR, Jaleel F. Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. J Pak Med Assoc. 2012 Feb;62(2):142-7.

Linn RL, Miller MD. Measurement and assessment in teaching. Upper Saddle River, New Jersey: Pearson Education Inc; 2005:348-65.

Tarrant M, Ware J, Mohammed AM. An assessment of functioning and non-functioning distractors in multiple-choice questions: a descriptive analysis. BMC Med Educ. 2009 Dec;9(1):40.

Ebel RL, Frisbie DA. Essentials of educational measurement. 5th ed. Englewood Cliffs, NJ: Prentice Hall; 1991.

Vyas R, Supe A. Multiple choice questions: a literature review on the optimal number of options. Natl Med J India. 2008;21:130-3.

Mehta G, Mokhasi V. Item analysis of multiple choice questions: an assessment of the assessment tool. Int J Health Sci Res. 2014;4:197-202.

Patil VC, Patil HV. Item analysis of medicine multiple choice questions (MCQs) for undergraduate (3rd year MBBS) students. Res J Pharmaceut Biol Chem Sci. 2015;6:1242-51.

Gajjar S, Sharma R, Kumar P, Rana M. Item and test analysis to identify quality multiple choice questions (MCQs) from an assessment of medical students of Ahmedabad, Gujarat. Indian J Comm Med. 2014 Jan;39(1):17-20.

Published

2019-03-27

How to Cite

Burud, I., Nagandla, K., & Agarwal, P. (2019). Impact of distractors in item analysis of multiple choice questions. International Journal of Research in Medical Sciences, 7(4), 1136–1139. https://doi.org/10.18203/2320-6012.ijrms20191313

Section

Original Research Articles