Automated Essay Scoring Framework for a Multilingual Medical Licensing Examination

  • Author / Creator
    Latifi, Syed Muhammad Fahad
  • Abstract
    Automated essay scoring (AES) is a technology that efficiently and economically scores written responses by emulating the intelligence of a human scorer. The present study employed open-source Natural Language Processing technologies to develop an AES framework for scoring a multilingual medical licensing examination. English, French, and translated French responses to constructed-response items were scored automatically, and the strength of the multilingual automated scoring framework was evaluated in relation to human scoring. Machine translation was also examined as a means of raising AES performance when a restricted sample size limits the performance of AES software. Specific feature extraction and model-building strategies resulted in high concordance between AES and human scoring, with an average maximum human-machine accuracy of 95.7%, an almost perfect level of agreement with human markers (an agreement computation of this kind is sketched after this record). Results also revealed that the machine translator raised predictive consistency but negatively influenced predictive accuracy. Implications of the results for practice, as well as directions for future research, are presented.

  • Subjects / Keywords
  • Graduation date
    Spring 2014
  • Type of Item
    Thesis
  • Degree
    Master of Education
  • DOI
    https://doi.org/10.7939/R3MP4VV1W
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
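As a rough illustration of the human-machine agreement evaluation the abstract describes, the Python sketch below computes exact agreement and quadratically weighted Cohen's kappa between a set of human and machine scores. The scores, the 0-4 scale, and the use of scikit-learn are illustrative assumptions, not details taken from the thesis itself.

    # Illustrative sketch only: hypothetical data, not the thesis's actual method.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical scores on a 0-4 scale for ten constructed responses.
    human_scores = np.array([3, 2, 4, 1, 3, 0, 2, 4, 3, 1])
    machine_scores = np.array([3, 2, 4, 2, 3, 0, 2, 4, 3, 1])

    # Exact agreement: fraction of responses where the two scores match.
    exact_agreement = np.mean(human_scores == machine_scores)

    # Quadratically weighted kappa: chance-corrected agreement that
    # penalizes larger score discrepancies more heavily; values above
    # 0.81 are conventionally described as "almost perfect" agreement.
    qwk = cohen_kappa_score(human_scores, machine_scores, weights="quadratic")

    print(f"Exact agreement: {exact_agreement:.1%}")
    print(f"Quadratic weighted kappa: {qwk:.3f}")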