Combining Sentiment-Combined Model with Pre-Trained BERT Models for Sentiment Analysis 


Vol. 48,  No. 7, pp. 815-824, Jul.  2021
10.5626/JOK.2021.48.7.815



  Abstract

It is known that BERT can capture various linguistic knowledge from raw text via language modeling, without any additional hand-crafted features. However, some studies have shown that BERT-based models that additionally exploit specific linguistic knowledge achieve higher performance on natural language processing problems related to that knowledge. Based on these findings, we trained a sentiment-combined model by adding sentiment features to the BERT architecture. We constructed sentiment feature embeddings from the sentiment polarity and intensity values annotated in a Korean sentiment lexicon, and we propose two methods, external fusing and knowledge distillation, for combining the sentiment-combined model with a general-purpose pre-trained BERT model. The external fusing method achieved higher performance on Korean sentiment analysis tasks (movie review and hate speech datasets) than baselines built on pre-trained models without a fused sentiment-combined model. We also observed that adding sentiment features to the BERT architecture improved both language modeling and sentiment analysis performance. Furthermore, when implementing sentiment-combined models, training time and cost can be reduced by using a small-scale BERT model with fewer layers, smaller dimensions, and fewer training steps.
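The sentiment feature embeddings described above can be illustrated with a minimal sketch (not the authors' code): each token's lexicon polarity and intensity values index small lookup tables, and the resulting vectors are summed with the token embedding, analogous to how BERT sums token, position, and segment embeddings. All names, value ranges, and sizes here are illustrative assumptions.

```python
# Hedged sketch of lexicon-based sentiment feature embeddings.
# The lexicon entries, feature bins, and dimensions are hypothetical.
import random

random.seed(0)
DIM = 8  # toy embedding dimension


def rand_vec(dim=DIM):
    """Random toy embedding vector."""
    return [random.uniform(-0.1, 0.1) for _ in range(dim)]


# Hypothetical sentiment lexicon: token -> (polarity, intensity),
# with polarity in {-1, 0, +1} and intensity binned into {0, 1, 2}.
lexicon = {"great": (1, 2), "boring": (-1, 1), "movie": (0, 0)}

# Separate lookup tables per sentiment feature, like BERT's segment table.
polarity_emb = {p: rand_vec() for p in (-1, 0, 1)}
intensity_emb = {i: rand_vec() for i in (0, 1, 2)}
token_emb = {tok: rand_vec() for tok in lexicon}


def input_embedding(token):
    """Sum token, polarity, and intensity embeddings for one token."""
    pol, inten = lexicon.get(token, (0, 0))  # neutral if not in lexicon
    t, p, i = token_emb[token], polarity_emb[pol], intensity_emb[inten]
    return [a + b + c for a, b, c in zip(t, p, i)]


vec = input_embedding("great")
print(len(vec))  # 8
```

In a full model these lookup tables would be trainable embedding matrices, and the summed vectors would feed the Transformer layers exactly as BERT's standard input embeddings do.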




  Cite this article

[IEEE Style]

S. Lee and H. Shin, "Combining Sentiment-Combined Model with Pre-Trained BERT Models for Sentiment Analysis," Journal of KIISE, JOK, vol. 48, no. 7, pp. 815-824, 2021. DOI: 10.5626/JOK.2021.48.7.815.


[ACM Style]

Sangah Lee and Hyopil Shin. 2021. Combining Sentiment-Combined Model with Pre-Trained BERT Models for Sentiment Analysis. Journal of KIISE, JOK, 48, 7, (2021), 815-824. DOI: 10.5626/JOK.2021.48.7.815.


[KCI Style]

Sangah Lee and Hyopil Shin, "Combining Sentiment-Combined Model with Pre-Trained BERT Models for Sentiment Analysis," Journal of KIISE, vol. 48, no. 7, pp. 815-824, 2021. DOI: 10.5626/JOK.2021.48.7.815. (in Korean)









Journal of KIISE

  • ISSN: 2383-630X (Print)
  • ISSN: 2383-6296 (Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel: +82-2-588-9240
  • Fax: +82-2-521-1352
  • E-mail: chwoo@kiise.or.kr