Solving for Redundant Repetition Problem of Generating Summarization using Decoding History 


Vol. 46,  No. 6, pp. 535-543, Jun.  2019
10.5626/JOK.2019.46.6.535



  Abstract

Neural attentional sequence-to-sequence models have achieved great success in abstractive summarization. However, such models are limited by several challenges, including the repetitive generation of words, phrases, and sentences during decoding. Many studies have attempted to address this problem by modifying the model structure. Although considering the actual history of word generation is crucial for reducing repetition, these methods do not take the decoding history of the generated sequence into account. In this paper, we propose a new loss function, called 'Repeat Loss', to avoid repetition. The Repeat Loss directly discourages repetitive generation by imposing a loss penalty on the generation probabilities of words that have already been produced in the decoding history. Since the proposed Repeat Loss does not require any special network structure, it is applicable to any existing sequence-to-sequence model. In experiments, we applied the Repeat Loss to a number of sequence-to-sequence summarization systems and trained them on both Korean and CNN/Daily Mail summarization datasets. The results demonstrate that the proposed method reduces repetition and produces high-quality summaries.
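The paper defines the Repeat Loss precisely in its body; as an illustration only, the idea of penalizing probability mass assigned to already-generated tokens can be sketched as below. The function name, the per-step probability-list representation, and the epsilon constant are all assumptions for this sketch, not the authors' implementation.

```python
import math

def repeat_loss(step_probs, generated_ids, eps=1e-12):
    """Illustrative sketch of a 'Repeat Loss'.

    At each decoding step, sum the probability the model assigns to
    tokens that already appear in the decoding history, and penalize
    that repeated mass with a negative log term. Zero repeated mass
    yields (near) zero loss; large repeated mass yields a large loss.

    step_probs:    list of per-step probability distributions over the vocab
    generated_ids: token id emitted at each step (the decoding history)
    """
    loss = 0.0
    history = set()
    for probs, tok in zip(step_probs, generated_ids):
        # Probability mass assigned to any token generated earlier.
        repeated_mass = sum(probs[t] for t in history)
        loss += -math.log(1.0 - repeated_mass + eps)
        history.add(tok)
    return loss
```

Because this term is computed only from the decoder's output distributions and the emitted sequence, it can be added to the usual cross-entropy objective of any sequence-to-sequence model without architectural changes, which matches the applicability claim in the abstract.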




  Cite this article

[IEEE Style]

J. Ryu, Y. Noh, S. J. Choi, S. Park, S. Park, "Solving for Redundant Repetition Problem of Generating Summarization using Decoding History," Journal of KIISE, JOK, vol. 46, no. 6, pp. 535-543, 2019. DOI: 10.5626/JOK.2019.46.6.535.


[ACM Style]

Jaehyun Ryu, Yunseok Noh, Su Jeong Choi, Seyoung Park, and Seong-Bae Park. 2019. Solving for Redundant Repetition Problem of Generating Summarization using Decoding History. Journal of KIISE, JOK, 46, 6, (2019), 535-543. DOI: 10.5626/JOK.2019.46.6.535.


[KCI Style]

Jaehyun Ryu, Yunseok Noh, Su Jeong Choi, Seyoung Park, Seong-Bae Park, "Solving for Redundant Repetition Problem of Generating Summarization using Decoding History," Journal of KIISE, vol. 46, no. 6, pp. 535-543, 2019. DOI: 10.5626/JOK.2019.46.6.535.






Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr