DOI: http://dx.doi.org/10.18203/2320-6012.ijrms20171817

A descriptive analysis of extended matching questions among third year medical students

Sehlule Vuma, Bidyadhar Sa

Abstract


Background: With changes in teaching methods in medicine, assessment tools have also evolved in order to be valid, reliable, practical, analyzable and not time-consuming. Because of concerns about the reliability and practicality of free-response short-answer questions, we replaced them with extended matching questions (EMQs). Previous analysis of the same group of students, over the same period, showed high reliability and discrimination with standard multiple choice questions (MCQs). The objective was to describe the efficiency of EMQs in third-year medicine courses.

Methods: Item-analysis reports of EMQ results over a three-year period, generated by the Castle Rock Integrity programme, were analyzed. There were 25 EMQ items in each course each year, each with nine answer options.

Results: The Kuder-Richardson 20 reliability mean ranged from 0.447 to 0.674. The Spearman-Brown split-half reliability coefficient mean ranged from 0.443 to 0.685. The Spearman-Brown prophecy reliability formula mean ranged from 0.614 to 0.837. The Guttman split-half reliability coefficient mean ranged from 0.441 to 0.718. The difficulty mean ranged from 0.491 to 0.719. The corrected point-biserial coefficient mean ranged from 0.118 to 0.255. The number of items with all-functioning distractors ranged from 16% to 40%, and the total number of non-functioning distractors ranged from 14.5% to 28%.

Conclusions: EMQs showed reliability, though lower than that of the MCQs previously analyzed. This may be due to the much smaller number of items, so increasing the number of EMQs should be considered. There was a high number of functioning distractors. Poor distractors should be revised.
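The reliability statistics reported above are standard psychometric formulas. A minimal sketch of two of them follows, not the authors' code: KR-20 computed from a hypothetical students-by-items matrix of 0/1 scores (the sample data are invented for illustration), and the Spearman-Brown prophecy formula, which predicts reliability when test length is multiplied.

```python
def kr20(responses):
    """Kuder-Richardson 20 for a students x items matrix of 0/1 item scores."""
    n_students = len(responses)
    k = len(responses[0])  # number of items
    # Sum of p*q over items, where p = proportion answering the item correctly.
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n_students
        pq_sum += p * (1 - p)
    # Population variance of the total scores.
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n_students
    var_total = sum((t - mean) ** 2 for t in totals) / n_students
    return (k / (k - 1)) * (1 - pq_sum / var_total)


def spearman_brown_prophecy(r, factor):
    """Predicted reliability if the test length is multiplied by `factor`."""
    return factor * r / (1 + (factor - 1) * r)


# Hypothetical data: 4 students x 3 items, scored 0/1.
data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]
print(kr20(data))                        # 0.75
print(spearman_brown_prophecy(0.5, 2))   # about 0.667
```

The prophecy formula also illustrates the conclusion's argument: doubling a test with reliability 0.5 predicts a reliability of about 0.67, which is why increasing the number of EMQ items would be expected to raise the observed coefficients.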


Keywords


Difficulty, Discrimination, Distractor, EMQ, Reliability

