TY - JOUR
T1 - Korean Text Summarization using MASS with Relative Position Representation
AU - Jung, Youngjun
AU - Hwang, Hyunsun
AU - Lee, Changki
JO - Journal of KIISE, JOK
PY - 2020
DA - 2020/1/14
DO - 10.5626/JOK.2020.47.9.873
KW - text summarization
KW - pre-training
KW - relative position representation
AB - In language generation tasks, deep learning-based models that generate natural language with Sequence-to-Sequence architectures are being actively studied. In text summarization, beyond extractive approaches that select only the core sentences from a document, abstractive summarization is under active investigation. Recently, transfer learning methods that fine-tune models pre-trained on large amounts of monolingual data, such as BERT and MASS, have become a major focus of natural language processing research. In this paper, MASS was pre-trained for Korean language generation and then applied to Korean text summarization. Experimental results show that the Korean text summarization model using MASS outperformed existing models. Additionally, the performance of the text summarization model was further improved by applying the relative position representation method to MASS.
ER -