TY - JOUR
T1 - Topic Centric Korean Text Summarization using Attribute Model
AU - Yoon, Su-Hwan
AU - Kim, A-Yeong
AU - Park, Seong-Bae
JO - Journal of KIISE, JOK
PY - 2021
DA - 2021/1/14
DO - 10.5626/JOK.2021.48.6.688
KW - machine learning
KW - pre-training
KW - MASS
KW - PPLM
AB - Abstractive summarization takes an original text as input and generates a summary containing its core information. Abstractive summarization models are mainly designed as Sequence-to-Sequence models. To improve both the quality and coherence of summaries, topic-centric methods that reflect the core information of the original text have recently been proposed. However, these previous methods require additional training steps, which makes it difficult to take advantage of a pre-trained language model. This paper proposes a topic-centric summarizer that reflects topic words in the summary while retaining the characteristics of the pre-trained language model by using PPLM. The proposed method requires no additional training. To demonstrate the effectiveness of the proposed summarizer, this paper reports summarization experiments on Korean newspaper data.
ER - 