Item Analysis as a Tool for Educational Assessment as Compared to Students' Evaluation of Lecturers

Authors

  • K Mohammed Sowdani *Department of Pharm. Chem., College of Pharmacy, Uni. Al-Mustansyriah
  • Huda Jaber ** Department of Clinical Lab. Sciences, College of Pharmacy, Uni. Al-Mustansyriah.
  • Ara Murad Thomas ** Department of Clinical Lab. Sciences, College of Pharmacy, Uni. Al-Mustansyriah.

DOI:

https://doi.org/10.32947/ajps.v18i2.484

Keywords:

Item Analysis, Facility Value, Discrimination Index, Reliability

Abstract

Item analysis is an effective method for assessing not only the test itself but also for revealing important issues across the whole educational process, from curriculum design and implementation to assessment and evaluation. The method is suited to multiple-choice (MCQ) questions. After the reliability of the test is checked, each question (item) is assessed using two criteria: the Facility Value (FV), which is the proportion of students who answered the item correctly, and the Discrimination Index (DI), which takes into account the number of students in the upper quartile and the number in the lower quartile who answered the item correctly. An unacceptable question falls outside the educationally acceptable limits of

0.15 < FV < 0.85 and DI > 0.1.
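The per-item screening amounts to a short calculation. The following is a minimal sketch, not taken from the paper, assuming the conventional definitions: FV as the proportion of correct answers, and DI as the difference in correct answers between the upper and lower quartiles (ranked by total score) divided by the quartile size. The function name `item_analysis` and the toy data are illustrative; the acceptance limits are those quoted above.

```python
# Minimal sketch of the FV/DI screening described above (not the authors' code).
# Assumes a 0/1 response matrix: rows = students, columns = items, 1 = correct.
# FV is taken as the proportion of correct answers per item; DI is assumed to
# follow the conventional upper-minus-lower form, using the top and bottom
# quartiles ranked by total score.

def item_analysis(responses):
    """Return (FV, DI, acceptable) for each item in a 0/1 response matrix."""
    n_students = len(responses)
    n_items = len(responses[0])
    # Rank students by total score, then split off upper and lower quartiles.
    ranked = sorted(responses, key=sum, reverse=True)
    q = max(1, n_students // 4)
    upper, lower = ranked[:q], ranked[-q:]

    results = []
    for j in range(n_items):
        fv = sum(row[j] for row in responses) / n_students      # facility value
        di = (sum(row[j] for row in upper)
              - sum(row[j] for row in lower)) / q               # discrimination index
        acceptable = 0.15 < fv < 0.85 and di > 0.1              # limits quoted above
        results.append((fv, di, acceptable))
    return results


if __name__ == "__main__":
    # Toy data: 8 students x 3 items (illustrative only).
    demo = [
        [1, 1, 0], [1, 1, 1], [1, 0, 1], [1, 1, 0],
        [1, 0, 0], [0, 1, 0], [1, 0, 0], [0, 0, 0],
    ]
    for i, (fv, di, ok) in enumerate(item_analysis(demo), start=1):
        print(f"Item {i}: FV={fv:.2f}  DI={di:.2f}  "
              f"{'acceptable' if ok else 'unacceptable'}")
```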

Unacceptable questions were identified and used to assess the teaching of the subject material and as indicators of lecturers' performance, based on the number of unacceptable questions and the averages of FV and DI. The smaller the number of unacceptable questions, the better the teaching of the subject material under study and the higher the performance of the lecturer.

Method: Random sampling (using a six-sided die) and systematic sampling (cumulatively adding the quotient of the population size divided by the sample size; see the sketch below) were used, and the reliability of both the MCQ and essay parts of the test was checked and found to be satisfactory, with R² > 0.7 (the closer to unity, the stronger the relationship and the higher the reliability).

Results: For the subject material under study, Pharmaceutical Organic Chemistry, taught by lecturer A, the first 13 of the 25 MCQ items tested satisfied the conditions set for the FV and DI of acceptable questions.

Conclusion: Acceptable items were identified and rated for further improvement of the stem or the distractors, especially those near the border limits; for further improvement, item distractors need to be analyzed in detail. This method is effective in quantitatively rating lecturers' ability to set effective questions in relation to the teaching objectives: the smaller the number of unacceptable items, the better the performance of the lecturer. Unacceptable questions can be excluded or revised in the future.
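The systematic sampling step in the Method is worded as cumulatively adding the quotient of the population size to the sample size. The sketch below is one interpretation of that wording, not the authors' exact procedure: the sampling interval is population size divided by sample size, and member positions are generated by cumulative addition of that interval from a random start. The function name `systematic_sample` and the roster are hypothetical.

```python
# Illustrative sketch of the systematic sampling step (an interpretation of the
# Method wording, not the authors' code): the interval is the quotient of
# population size over sample size, and indices come from cumulatively adding
# that interval to a random starting point.

import random


def systematic_sample(population, sample_size):
    """Select `sample_size` members by fixed-interval (systematic) sampling."""
    interval = len(population) / sample_size       # quotient: population / sample size
    start = random.uniform(0, interval)            # random start within the first interval
    # Cumulatively add the interval and take the member at each resulting position.
    positions = [start + k * interval for k in range(sample_size)]
    return [population[int(p)] for p in positions]


if __name__ == "__main__":
    students = [f"student_{i:03d}" for i in range(1, 101)]  # hypothetical roster of 100
    print(systematic_sample(students, 10))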


Published

2018-12-01

How to Cite

Sowdani, K. M., Jaber, H., & Thomas, A. M. (2018). Item Analysis as a Tool for Educational Assessment as Compared to Students' Evaluation of Lecturers. Al Mustansiriyah Journal of Pharmaceutical Sciences, 18(2), 105–113. https://doi.org/10.32947/ajps.v18i2.484
