Article Details

Journal of Computers (indexed in EI, MEDLINE, Scopus)

Title: A BERT Based Single Document Extractive Summarization Model
Volume/Issue: 31:2
Authors: Wei Liu, Pei-Ran Song, Rui-Li Jiao
Pages: 241-249
Keywords: BERT; extractive summarization; sentence position embedding; single document
Publication Date: April 2020
DOI: 10.3966/199115992020043102020

Abstract

BERT, a pre-trained Transformer model, has already become one of the most common models in multiple natural language processing (NLP) tasks. It has been customized for extractive summarization via the fine-tuned BERTSUM model. Unlike other NLP tasks, extractive summarization relies heavily on sentence position information at the document level. However, this crucial feature has not been fully studied in existing models, whether BERT or BERTSUM. In this paper, we propose a novel single document extractive summarization model, which incorporates sentence positions through an extra documental position embedding module. The proposed model has been tested on the well-known CNN/DailyMail dataset. Results show that the performance of our model is competitive with the state-of-the-art models on this task. Ablation experiments prove that the quality of the extracted summary can be improved by adding the documental sentence position embedding module.
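The abstract's central idea, adding a learned document-level sentence position embedding to BERT-derived sentence representations before scoring sentences for extraction, can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' released code: the class name `SentencePositionScorer`, the dimensions, and the linear scoring head are all illustrative choices. In BERTSUM-style models, the per-sentence vectors are typically the outputs at [CLS] tokens inserted before each sentence.

```python
# Minimal sketch (assumed names and shapes, not the paper's implementation)
# of a document-level sentence position embedding added on top of
# per-sentence BERT vectors, followed by a sentence extraction scorer.
import torch
import torch.nn as nn

class SentencePositionScorer(nn.Module):
    def __init__(self, hidden_size=768, max_sentences=128):
        super().__init__()
        # Learned embedding for each sentence's position within the document.
        self.doc_pos_embedding = nn.Embedding(max_sentences, hidden_size)
        # Simple head: one extraction score per sentence.
        self.classifier = nn.Linear(hidden_size, 1)

    def forward(self, sentence_vectors):
        # sentence_vectors: (batch, num_sentences, hidden_size), e.g. the
        # [CLS] vectors a BERTSUM-style encoder produces per sentence.
        batch, num_sents, _ = sentence_vectors.shape
        positions = torch.arange(num_sents, device=sentence_vectors.device)
        positions = positions.unsqueeze(0).expand(batch, num_sents)
        # Add the document-level position embedding to each sentence vector.
        enriched = sentence_vectors + self.doc_pos_embedding(positions)
        # Return a score per sentence: (batch, num_sentences).
        return self.classifier(enriched).squeeze(-1)

# Usage with dummy sentence vectors:
scorer = SentencePositionScorer()
sent_vecs = torch.randn(2, 10, 768)   # 2 documents, 10 sentences each
scores = scorer(sent_vecs)            # shape: (2, 10)
print(scores.shape)
```

In practice the highest-scoring sentences would be selected as the extractive summary; the ablation result reported in the abstract suggests that this added position signal, beyond BERT's token-level position encoding, is what improves summary quality.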
