
Study on the Correlation Between the Discrimination Index, Facilitation Value and Distractor Efficiency of a Formative Assessment: Useful Tools for OBE Practices


Authors

Arun Kumar B.R
BMS Institute of Technology and Management, India

Abstract

Different assessment tools are used to measure engineering knowledge and skills, and multiple choice questions (MCQs) together with regular internal tests are among the most widely used. Imperfections in how questions are framed, however, can distort students' results and defeat the purpose of evaluating their knowledge. This article evaluates the quality of MCQs and internal test questions framed for postgraduate (MCA) students in a programming-language course. The analysis applied Kelley's method, comparing the performance of the fast-learning and slow-learning groups under both methods of evaluation. The facilitation value, discrimination index, and distractor efficiency were estimated for the items attempted by the group of MCA students. The results show that item analysis provided the data needed to improve question formulation and to review the quality of both individual items and the tests as a whole. Questions with a lower difficulty index were significantly associated with a higher discrimination index and higher distractor efficiency.
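For readers who want to reproduce this kind of item analysis, the sketch below illustrates the three indices on Kelley-style upper and lower groups: the facilitation (difficulty) value FV = 100 × (H + L) / (2k) and the discrimination index DI = (H − L) / k, where H and L are the numbers of correct responses in the upper and lower groups and k is the size of each group, plus distractor efficiency computed from the share of non-functional distractors (options chosen by fewer than 5% of students). This is a minimal illustration, not the article's own code: the 27% group split and the 5% threshold are the conventional values from the item-analysis literature, and the function name and sample data are invented for demonstration.

from collections import Counter

def item_analysis(scores, responses, correct_option, options="ABCD"):
    # Rank students by total test score and take Kelley's upper/lower 27%.
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    k = max(1, round(0.27 * len(scores)))
    upper, lower = ranked[:k], ranked[-k:]

    # H and L: correct responses to this item in the upper and lower groups.
    h = sum(responses[i] == correct_option for i in upper)
    l = sum(responses[i] == correct_option for i in lower)

    facilitation = 100 * (h + l) / (2 * k)   # facilitation (difficulty) value, %
    discrimination = (h - l) / k             # discrimination index, -1 to 1

    # Distractor efficiency: a distractor is non-functional (NFD) when
    # fewer than 5% of all students choose it.
    counts = Counter(responses)
    distractors = [o for o in options if o != correct_option]
    nfd = sum(counts[o] < 0.05 * len(responses) for o in distractors)
    efficiency = 100 * (len(distractors) - nfd) / len(distractors)
    return facilitation, discrimination, efficiency

# Invented sample: 10 students' total scores and their answers to one item keyed "B".
scores = [18, 17, 16, 15, 12, 11, 10, 8, 7, 5]
answers = ["B", "B", "B", "B", "A", "B", "C", "A", "D", "A"]
print(item_analysis(scores, answers, "B"))   # -> (50.0, 1.0, 100.0)

Items with mid-range facilitation values and clearly positive discrimination are the ones such an analysis flags as well framed; the exact cut-offs should be taken from the cited literature rather than from this sketch.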

Keywords

Assessment, Difficulty index, Discrimination index, Distractor efficiency, Formative assessment.
References


  • Anwar, F. (2019). Activity-based teaching, student motivation and academic achievement. Journal of Education and Educational Development, 6(1), 154–170.
  • Burud, I., Nagandla, K., & Agarwal, P. (2019). Impact of distractors in item analysis of multiple choice questions. International Journal of Research in Medical Sciences, 7(4), 1136–1139.
  • Chalmers, R. P. (2020). Partially and fully noncompensatory response models for dichotomous and polytomous items. Applied Psychological Measurement, 44(6), 415–430.
  • Collier, D. R., & Gallagher, T. L. (2020). Blogging in elementary classrooms: Mentoring teacher candidates to use formative writing assessment and connect theory to practice. Teaching/Writing: The Journal of Writing Teacher Education, 9, Article 11.
  • Deena, G., Raja, K., Nizar Banu, P. K., & Kannan, K. (2020). Developing the assessment questions automatically to determine the cognitive level of the e-learner using NLP techniques. International Journal of Service Science, Management, Engineering and Technology, 11, 16.
  • Ebel, R. L. (1979). Essentials of educational measurement (3rd ed.). Prentice Hall.
  • Fozzard, N., Pearson, A., du Toit, E., Naug, H., Wen, W., & Peak, I. R. (2018). Analysis of MCQ and distractor use in a large first year Health Faculty Foundation Program: Assessing the effects of changing from five to four options. BMC Medical Education, 18(1), 252.
  • Kaur, M., Singla, S., & Mahajan, R. Item analysis of multiple choice questions in Pharmacology. International Journal of Applied and Basic Medical Research. Wolters Kluwer–Medknow.
  • Kelley, T. L. (1939). The selection of upper and lower groups for the validation of test items. Journal of Educational Psychology, 30(1), 17–24.
  • Metsämuuronen, J. (2018). Generalized discrimination index.
  • Namdeo, S. K., & Rout, S. D. (2016). Assessment of functional and non-functional distracter in an item analysis. International Journal of Contemporary Medical Research, 3(7), 1891–1893.
  • Nguyen, K. A., Lucas, C., & Leadbeatter, D. (2020). Student generation and peer review of examination questions in the dental curriculum: Enhancing student engagement and learning. European Journal of Dental Education, 00, 1–11.
  • Pho, V., Ligozat, A., & Grau, B. (2015, June 21–25). Distractor quality evaluation in multiple choice questions [Conference session]. International Conference on Artificial Intelligence in Education, Madrid, Spain.
  • Quaigrain, K., & Arhin, A. K. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education, 4(1), 1301013.
  • Ramirez-Mendoza, R., Morales-Menendez, R., Iqbal, H., & Parra-Saldivar, R. (2018). Engineering education 4.0: Proposal for a new curriculum. In 2018 IEEE Global Engineering Education Conference (pp. 1273–1282).
  • Rehman, A., Aslam, A., & Hassan, S. H. (2018). Item analysis of multiple choice questions. Pakistan Oral and Dental Journal, 38(2), 291–293.
  • Sahoo, D. P., & Singh, R. Item and distracter analysis of multiple choice questions (MCQs) from a preliminary examination of undergraduate medical students. International Journal of Research in Medical Sciences, 5(12).
  • Serdar, S., Mustafa, H. B., & Mohammad, A. (2020). Computer based evaluation to assess students' learning for multiple-choice question-based exams: CBE-MCQs software tool. Computer Applications in Engineering Education, 1–15.
  • Tang, T., Vezzani, V., & Eriksson, V. (2020). Developing critical thinking, collective creativity skills and problem solving through playful design jams. Thinking Skills and Creativity, 37.
  • Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Medical Education, 9, 40.
