TY  - JOUR
T1  - Korean Text Summarization using MASS with Copying and Coverage Mechanism and Length Embedding
AU  - Jung, Youngjun
AU  - Lee, Changki
AU  - Go, Wooyoung
AU  - Yoon, Hanjun
JO  - Journal of KIISE, JOK
PY  - 2022
DA  - 2022/1/14
DO  - 10.5626/JOK.2022.49.1.25
KW  - text summarization
KW  - pre-training
KW  - MASS
KW  - copying mechanism
KW  - coverage mechanism
KW  - length embedding
AB  - Text summarization is a technology that generates a summary containing the important and essential information of a given document, and end-to-end abstractive summarization models based on sequence-to-sequence architectures are the main focus of current research. Recently, transfer learning methods that fine-tune models pre-trained on large-scale monolingual data have been actively studied in the field of natural language processing. In this paper, we applied the copying mechanism to the MASS model, conducted pre-training for Korean language generation, and then applied the model to Korean text summarization. In addition, a coverage mechanism and length embedding were applied to further improve the summarization model. Experimental results showed that the Korean text summarization model combining the copying and coverage mechanisms with the MASS model outperformed existing models, and that the length of the summary could be controlled through length embedding.