
Developing a Framework for Online Practice Examination and Automated Score Generation



Authors

S. M. Saniul Islam Sani
Department of Computer Science & Engineering, Chittagong University of Engineering & Technology, Chittagong-4349, Bangladesh
Rezaul Karim
Department of Computer Science & Engineering, Chittagong University of Engineering & Technology, Chittagong-4349, Bangladesh
Mohammad Shamsul Arefin
Department of Computer Science & Engineering, Chittagong University of Engineering & Technology, Chittagong-4349, Bangladesh

Abstract


Examination is the process by which the ability and quality of examinees can be measured, and it is necessary for ensuring their quality. An online examination system allows participants to sit an examination regardless of their location by connecting to the examination site over the Internet from a desktop computer, laptop, or smartphone. Automated score generation is the process by which the answer scripts of an examination are evaluated automatically to produce scores. Although many online examination systems exist, their main drawback is that they cannot compute scores accurately, especially for text-based answers. Moreover, most of them are monolingual, so examinees can take the examination in only one language. Considering these facts, in this paper we present a framework that can administer Multiple Choice Question (MCQ) examinations and written examinations in two languages, English and Bangla. We develop a database in which the questions and answers are stored. The questions from the database are displayed on the web page, with answer options for the MCQ questions and text boxes for the written questions. To generate scores for the written questions, we perform several types of analysis on the submitted answers; for the MCQ questions, we simply compare the stored answers with the users' answers. We conducted several experiments to check the accuracy of score generation by our system and found that it generates 100% accurate scores for MCQ questions and more than 90% accurate scores for text-based questions.
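
The two scoring paths described above can be illustrated with a minimal Python sketch. The paper does not publish its implementation here, so the function names, the keyword-overlap similarity used as a stand-in for its "several types of analysis" of written answers, and the sample data below are illustrative assumptions only; the exact-match comparison for MCQ answers is the approach the abstract itself states.

    # Minimal sketch of the two scoring paths described in the abstract.
    # The keyword-overlap metric and all names here are assumptions; the
    # paper's actual text-analysis method is not specified on this page.

    def score_mcq(db_answers: dict[str, str], user_answers: dict[str, str]) -> int:
        """Exact comparison of stored MCQ answers against submitted answers."""
        return sum(1 for qid, correct in db_answers.items()
                   if user_answers.get(qid) == correct)

    def score_text(model_answer: str, user_answer: str, marks: float) -> float:
        """Score a written answer by keyword overlap with the model answer,
        scaled to the question's full marks (assumed metric)."""
        model_tokens = set(model_answer.lower().split())
        user_tokens = set(user_answer.lower().split())
        if not model_tokens:
            return 0.0
        overlap = len(model_tokens & user_tokens) / len(model_tokens)
        return round(overlap * marks, 2)

    if __name__ == "__main__":
        print(score_mcq({"q1": "b", "q2": "d"}, {"q1": "b", "q2": "a"}))  # 1
        print(score_text("Dhaka is the capital of Bangladesh",
                         "the capital of Bangladesh is Dhaka", 5))        # 5.0
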

Keywords


Multiple Choice Questions, Automated Scoring, Answer Analysis, Experimental Analysis.
