Abstract:
Community Question Answering (CQA) systems open new and interesting research directions for the traditional Question Answering (QA) field, notably the exploitation of the interaction between users and the structure of related posts. In recent years, crowdsourcing has become essential in a wide range of web applications. One of its biggest challenges is the quality of crowd answers, since workers (answerers) have widely varying levels of expertise and the worker community may contain incompetent workers. Although various quality-control techniques have been proposed, a processing phase in which crowd answers are validated is still required, and that is what this work addresses. The main objective of this research is to develop a system that validates crowd answers using the answerers' personal attributes. Answer validation is conducted through a weighted scheme in which weights are assigned to answerer attributes, including grade point, years of experience, level of computing, course of study, area of specialization, question understandability, and the answerer's confidence level; an answer is judged valid when its aggregate weight meets the quota, the minimum weight the system requires of a candidate answer.
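As a rough illustration of this weighted scheme, the minimal Python sketch below scores an answerer's attributes against a quota; the attribute weights, normalized scores, and quota value are assumed placeholders, not the system's actual configuration.

# Weight assigned to each answerer attribute (assumed values, not the paper's).
ATTRIBUTE_WEIGHTS = {
    "grade_point": 0.20,
    "years_of_experience": 0.15,
    "level_of_computing": 0.15,
    "course_of_study": 0.10,
    "area_of_specialization": 0.15,
    "question_understandability": 0.15,
    "confidence_level": 0.10,
}

QUOTA = 0.60  # assumed minimum aggregate weight for a valid candidate answer

def validate_answer(attribute_scores: dict) -> bool:
    """Return True if the answerer's weighted score meets the quota.

    attribute_scores maps each attribute to a normalized score in [0, 1].
    """
    total = sum(
        ATTRIBUTE_WEIGHTS[attr] * attribute_scores.get(attr, 0.0)
        for attr in ATTRIBUTE_WEIGHTS
    )
    return total >= QUOTA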
The candidate answers are then ranked by the crowd using the Borda Count ranking algorithm.
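The Borda Count step can be sketched as follows (a minimal Python sketch; the example ballots are hypothetical, and each crowd member is assumed to submit a complete best-first ranking of the candidate answers):

from collections import defaultdict

def borda_count(rankings):
    """Rank candidate answers by the Borda Count rule.

    Each ballot lists candidates best-first; a candidate in position i
    of an n-candidate ballot earns n - 1 - i points. Candidates are
    returned sorted by total points, highest first.
    """
    scores = defaultdict(int)
    for ballot in rankings:
        n = len(ballot)
        for position, candidate in enumerate(ballot):
            scores[candidate] += n - 1 - position
    return sorted(scores, key=scores.get, reverse=True)

# Example: three crowd members rank three candidate answers.
ballots = [["A", "B", "C"], ["A", "C", "B"], ["B", "A", "C"]]
print(borda_count(ballots))  # -> ['A', 'B', 'C']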
The system is evaluated by users via a questionnaire: twenty (20) questions are formed from ISO/IEC standard metrics, and users are asked to rate the extent to which the answers provided by the system are correct, valid, of high quality, and so on. Thirty (30) questionnaires were completed and rated on a four-level rating scale; the ratings obtained were analyzed using the weighted-mean technique, and answer quality, correctness, and effectiveness were rated at up to 93%. The evaluation results on the validated answers to computing-related questions demonstrate the effectiveness of the system.
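For illustration, the weighted-mean analysis on a four-level rating scale can be sketched as below (Python; the response counts are hypothetical and chosen only to show how a score near 93% can arise, not the study's actual data):

def weighted_mean_percent(counts):
    """counts[i] = number of respondents choosing rating i + 1 on a 1-4 scale.

    Returns the weighted mean expressed as a percentage of the maximum rating.
    """
    total_responses = sum(counts)
    weighted_sum = sum((i + 1) * c for i, c in enumerate(counts))
    return 100.0 * weighted_sum / (4 * total_responses)

# Example: 30 respondents rating one questionnaire item.
print(round(weighted_mean_percent([0, 2, 4, 24]), 1))  # -> 93.3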