Combining Sentiment-Combined Model with Pre-Trained BERT Models for Sentiment Analysis
http://doi.org/10.5626/JOK.2021.48.7.815
It is known that BERT can capture various kinds of linguistic knowledge from raw text via language modeling, without any additional hand-crafted features. However, some studies have shown that BERT-based models that additionally use specific linguistic knowledge achieve higher performance on natural language processing tasks associated with that knowledge. Based on this finding, we trained a sentiment-combined model by adding sentiment features to the BERT architecture. We constructed sentiment feature embeddings from the sentiment polarity and intensity values annotated in a Korean sentiment lexicon, and we propose two methods (external fusing and knowledge distillation) for combining the sentiment-combined model with a general-purpose pre-trained BERT model. The external fusing method achieved higher performance on Korean sentiment analysis tasks, using movie review and hate speech datasets, than baselines built on pre-trained models not fused with the sentiment-combined model. We also observed that adding sentiment features to the BERT architecture improved the model's language modeling and sentiment analysis performance. Furthermore, when implementing sentiment-combined models, training time and cost can be reduced by using a small-scale BERT model with fewer layers, smaller hidden dimensions, and fewer training steps.
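The abstract does not specify how the sentiment features enter the model, so the following is only a minimal PyTorch sketch of one plausible realization: lexicon-derived polarity and intensity IDs are given their own embedding tables and summed into a BERT-style embedding layer alongside the token, position, and segment embeddings. All layer names, bucket counts, and dimensions below are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class SentimentCombinedEmbeddings(nn.Module):
    """BERT-style embedding layer extended with sentiment polarity/intensity terms (illustrative)."""
    def __init__(self, vocab_size=30000, hidden=256, max_len=512,
                 n_polarity=5, n_intensity=4):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, hidden)       # token embeddings
        self.pos = nn.Embedding(max_len, hidden)          # position embeddings
        self.seg = nn.Embedding(2, hidden)                 # segment embeddings
        # Extra embeddings for per-token sentiment features taken from a lexicon
        # (hypothetical bucket counts for polarity and intensity values).
        self.polarity = nn.Embedding(n_polarity, hidden)
        self.intensity = nn.Embedding(n_intensity, hidden)
        self.norm = nn.LayerNorm(hidden)
        self.drop = nn.Dropout(0.1)

    def forward(self, input_ids, segment_ids, polarity_ids, intensity_ids):
        positions = torch.arange(input_ids.size(1), device=input_ids.device)
        positions = positions.unsqueeze(0).expand_as(input_ids)
        # Sum all feature embeddings, as in the standard BERT embedding layer,
        # with two additional sentiment terms per token.
        x = (self.tok(input_ids) + self.pos(positions) + self.seg(segment_ids)
             + self.polarity(polarity_ids) + self.intensity(intensity_ids))
        return self.drop(self.norm(x))

The output of this layer would feed an ordinary Transformer encoder; the small default hidden size reflects the paper's observation that a small-scale BERT model suffices for the sentiment-combined component.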

Journal of KIISE
- ISSN : 2383-630X(Print)
- ISSN : 2383-6296(Electronic)
- KCI Accredited Journal