PrefixLM for Korean Text Summarization 


Vol. 49, No. 6, pp. 475-487, Jun. 2022
10.5626/JOK.2022.49.6.475



  Abstract

In this paper, we examine the effectiveness of PrefixLM, which uses half the parameters of T5's encoder-decoder architecture, for Korean text generation tasks. Unlike T5, where the input and output sequences are provided separately, the transformer block of PrefixLM takes a single sequence that concatenates the input and output sequences. Through the design of its attention mask, PrefixLM applies bi-directional attention to the input sequence and uni-directional attention to the output sequence, allowing a single transformer block to play the roles of both encoder and decoder. Experimental results on Korean abstractive document summarization show that PrefixLM yields improvements of 2.17 and 2.78 in ROUGE-F1 score over BART and T5, respectively, suggesting that PrefixLM is promising for Korean text generation tasks.
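To make the masking scheme concrete, the snippet below is a minimal illustrative sketch of how a PrefixLM-style attention mask over a single concatenated input-output sequence can be constructed. It is not the authors' implementation; the function name prefix_lm_mask, the NumPy representation, and the boolean mask convention are assumptions made for illustration only.

```python
import numpy as np

def prefix_lm_mask(prefix_len: int, total_len: int) -> np.ndarray:
    """Illustrative PrefixLM attention mask (hypothetical helper, not from the paper).

    Positions [0, prefix_len) are the input (prefix) tokens and attend to each
    other bi-directionally; positions [prefix_len, total_len) are the output
    tokens and attend uni-directionally (causally) to the prefix and to earlier
    output tokens. Returns a (total_len, total_len) boolean matrix where
    mask[i, j] == True means token i may attend to token j.
    """
    # Start from a causal (lower-triangular) mask over the whole sequence.
    mask = np.tril(np.ones((total_len, total_len), dtype=bool))
    # Allow full bi-directional attention within the prefix block.
    mask[:prefix_len, :prefix_len] = True
    return mask

# Example: a 4-token input followed by a 3-token output.
print(prefix_lm_mask(prefix_len=4, total_len=7).astype(int))
```

With this mask, a single transformer block sees the prefix the way an encoder would (full attention) and generates the output the way a decoder would (causal attention), which is the dual role described in the abstract.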



  Cite this article

[IEEE Style]

K. Lee, S. Na, J. Lim, T. Kim, D. Chang, "PrefixLM for Korean Text Summarization," Journal of KIISE, JOK, vol. 49, no. 6, pp. 475-487, 2022. DOI: 10.5626/JOK.2022.49.6.475.


[ACM Style]

Kun-Hui Lee, Seung-Hoon Na, Joon-Ho Lim, Tae-Hyeong Kim, and Du-Seong Chang. 2022. PrefixLM for Korean Text Summarization. Journal of KIISE, JOK, 49, 6, (2022), 475-487. DOI: 10.5626/JOK.2022.49.6.475.


[KCI Style]

Kun-Hui Lee, Seung-Hoon Na, Joon-Ho Lim, Tae-Hyeong Kim, and Du-Seong Chang, "Korean Text Summarization Based on PrefixLM," Journal of KIISE (JOK), vol. 49, no. 6, pp. 475-487, 2022. DOI: 10.5626/JOK.2022.49.6.475.




