Article Details

International Journal of Computational Linguistics And Chinese Language Processing THCI

Title: Correcting Serial Grammatical Errors based on N-grams and Syntax
Volume/Issue: 18:4
Authors: Jian-cheng Wu, Jim Chang, Jason S. Chang
Pages: 031-044
Keywords: Grammatical Error Correction, Serial Errors, Machine Translation, N-grams, Language Model, THCI Core
Publication Date: 2013-12


English Abstract

In this paper, we present a new method based on machine translation for correcting serial grammatical errors in a given sentence in learners’ writing. In our approach, translation models are generated to translate the input into a grammatical sentence. The method involves automatically learning two translation models based on Web-scale n-grams. The first model translates trigrams containing serial preposition-verb errors into correct ones. The second is a back-off model, used when the trigram is not found in the training data. At run-time, the phrases in the input are matched and translated, and all possible translations are ranked to produce a corrected sentence as output. Evaluation on a set of sentences from a learner corpus shows that the method corrects serial errors reasonably well. Our methodology exploits the state of the art in machine translation, resulting in an effective system that can deal with many error types at the same time.
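The abstract describes the pipeline (a trigram translation model, a back-off model, and run-time ranking of candidate translations) but not its implementation. The Python sketch below is only an illustration of that general idea under loudly stated assumptions: the toy tables TRIGRAM_TABLE and BACKOFF_TABLE, the names translate_trigram and correct, and the dummy lm_score are hypothetical stand-ins invented for this example, not the authors' system, which learns its models from Web-scale n-grams and ranks with a real language model.

import math

# Hypothetical trigram translation table: erroneous trigram -> candidate
# corrections with translation probabilities (the paper learns such mappings
# automatically from Web-scale n-grams).
TRIGRAM_TABLE = {
    ("discuss", "about", "the"): {("discuss", "the"): 0.7,
                                  ("discuss", "about", "the"): 0.3},
}

# Hypothetical back-off table keyed on the leading bigram, consulted when the
# full trigram is not found in the training data.
BACKOFF_TABLE = {
    ("listen", "music"): {("listen", "to", "music"): 0.8,
                          ("listen", "music"): 0.2},
}

def translate_trigram(trigram):
    """Return {correction: probability} for a trigram, backing off to the
    bigram table and finally to the identity translation."""
    if trigram in TRIGRAM_TABLE:
        return TRIGRAM_TABLE[trigram]
    bigram = trigram[:2]
    if bigram in BACKOFF_TABLE:
        return {fix + trigram[2:]: p for fix, p in BACKOFF_TABLE[bigram].items()}
    return {trigram: 1.0}

def lm_score(tokens):
    """Stand-in for a Web-scale n-gram language model score; a real system
    would score the whole candidate sentence here."""
    return 0.0

def correct(sentence):
    """Match and translate trigram spans left to right, then rank all
    candidate outputs by translation log-probability plus LM score."""
    tokens = sentence.split()
    hypotheses = [((), 0.0, 0)]   # (output tokens, log translation score, input position)
    finished = []
    while hypotheses:
        out, score, i = hypotheses.pop()
        if i >= len(tokens):
            finished.append((out, score))
            continue
        span = tuple(tokens[i:i + 3])
        if len(span) == 3:
            for fix, prob in translate_trigram(span).items():
                hypotheses.append((out + fix, score + math.log(prob), i + 3))
        else:
            finished.append((out + span, score))   # short tail, copy through
    best = max(finished, key=lambda h: h[1] + lm_score(h[0]))
    return " ".join(best[0])

if __name__ == "__main__":
    print(correct("He wants to discuss about the problem today"))
    # -> "He wants to discuss the problem today" under the toy tables above

Under these toy tables the sketch corrects only the one preposition error it knows about; the point is the control flow (trigram lookup, back-off when the trigram is unseen, and ranking of all candidate outputs), not the coverage of a system trained on Web-scale n-grams.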
