Search results for author 최용석 (4)

Performance Analysis of Korean Morphological Analyzer based on Transformer and BERT

Yongseok Choi, Kong Joo Lee

http://doi.org/10.5626/JOK.2020.47.8.730

This paper introduces a Korean morphological analyzer based on the Transformer, one of the most popular sequence-to-sequence deep neural models. The Transformer comprises an encoder and a decoder: the encoder compresses a raw input sentence into a fixed-size vector, and the decoder generates the morphological analysis result from that vector. We also replace the encoder with BERT, a pre-trained language representation model. An attention mechanism and a copying mechanism are integrated into the decoder. The processing units of the encoder and the decoder are eojeol-based WordPiece tokens and morpheme-based WordPiece tokens, respectively. Experimental results showed that the Transformer with a fine-tuned BERT encoder outperforms a randomly initialized Transformer by 2.9% in F1 score. We also investigated the effect of the WordPiece embeddings on morphological analysis when they are not fully updated during training.
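The copying mechanism mentioned in the abstract can be illustrated with a minimal sketch: the decoder mixes its vocabulary (generation) distribution with a copy distribution given by attention over source tokens. The function name, probabilities, and tokens below are illustrative assumptions, not the paper's implementation.

```python
def copy_mechanism(p_gen, vocab_probs, attn_weights, source_tokens):
    """Mix a generation distribution with a copy distribution.

    p_gen         -- probability of generating from the vocabulary (0..1)
    vocab_probs   -- dict token -> probability from the decoder softmax
    attn_weights  -- attention weights over source positions (sum to 1)
    source_tokens -- source-side tokens aligned with attn_weights
    """
    # Scale the vocabulary distribution by the generation probability.
    final = {tok: p_gen * p for tok, p in vocab_probs.items()}
    # Route the remaining mass to source tokens via attention weights.
    for weight, tok in zip(attn_weights, source_tokens):
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * weight
    return final

probs = copy_mechanism(
    p_gen=0.7,
    vocab_probs={"먹/VV": 0.6, "었/EP": 0.3, "다/EF": 0.1},
    attn_weights=[0.8, 0.2],
    source_tokens=["먹었다", "었/EP"],
)
```

Because the copy mass is a reweighted attention distribution, the result remains a valid probability distribution, and a source token that also appears in the vocabulary (here "었/EP") accumulates probability from both paths.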

Sports Broadcasting with Deep Learning

Byeong Jo Kim, Yong Suk Choi

http://doi.org/10.5626/JOK.2019.46.10.1020

Sports broadcasting requires understanding and reasoning about the current situation based on information from sports scenes, players, and past knowledge. In this paper, we introduced how a scene classifier, a player detector, and a motion recognizer can be used to extract information from sports images and understand the current situation. We created three types of commentaries: one from web data, another from 13 scene categories via the scene classifier, and the third generated from player positions, eight motion classes, and an ontology. Data from KBO (Korea Baseball Organization) League games played between April 1 and April 14, 2018 were labeled directly to train the model.
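The third commentary type — generated from recognized scenes, player positions, and motions — can be sketched as a template-filling step over the recognizers' outputs. The scene labels, templates, and player names below are illustrative assumptions, not the paper's ontology.

```python
# Toy commentary generator: map a classified scene to a template and
# fill it with detected entities (players, motions).
SCENE_TEMPLATES = {
    "pitch": "{pitcher} delivers the pitch to {batter}.",
    "hit":   "{batter} makes contact and the ball is in play!",
}

def generate_commentary(scene, detections):
    """Return a commentary line for a scene label, or a silent beat
    when the scene is not covered by any template."""
    template = SCENE_TEMPLATES.get(scene)
    if template is None:
        return "..."
    return template.format(**detections)

line = generate_commentary("pitch", {"pitcher": "Kim", "batter": "Lee"})
```

In the paper's pipeline, the scene label would come from the scene classifier and the entity slots from the player detector and motion recognizer; here both are hard-coded for illustration.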

Korean Dependency Parser using Higher-order features and Stack-Pointer Networks

Yong-seok Choi, Kong Joo Lee

http://doi.org/10.5626/JOK.2019.46.7.636

Syntactic parsing analyzes the syntactic structure of an input sentence and resolves its syntactic ambiguities. The Korean language has relatively free word order and frequently omits nouns such as subjects and objects, so dependency parsers are known to be well suited to parsing Korean. A stack-pointer network combines a pointer network with an internal stack: it reads and encodes the whole input sentence, then builds a dependency tree top-down in a depth-first fashion. In this paper, we employed a stack-pointer network to parse Korean and utilized higher-order information so the parser can benefit from all previously derived subtree structures. Experimental results revealed that the dependency parser using sibling nodes as higher-order features achieved an unlabeled attachment score (UAS) of 92.63%.
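The top-down, depth-first decoding order of a stack-pointer network — and the sibling feature it can condition on — can be sketched without any neural machinery. Below, a gold tree stands in for the pointer's predictions; the function and the example tree are illustrative assumptions.

```python
def decode_order(children, root=0):
    """Yield (head, dependent, previous_sibling) arcs in the
    depth-first order a stack-pointer decoder would emit them."""
    # Copy child lists so the caller's tree is not mutated.
    children = {head: list(deps) for head, deps in children.items()}
    stack = [root]
    last_child = {}   # head -> most recently attached child (sibling feature)
    arcs = []
    while stack:
        head = stack[-1]
        remaining = children.get(head, [])
        if remaining:
            dep = remaining.pop(0)
            # The previously attached child of this head is the
            # higher-order sibling feature for the new arc.
            arcs.append((head, dep, last_child.get(head)))
            last_child[head] = dep
            stack.append(dep)   # descend depth-first into the new child
        else:
            stack.pop()         # no more children: backtrack to the parent
    return arcs

# Child lists per head (0 is an artificial root), left to right.
tree = {0: [3], 3: [1, 2]}
arcs = decode_order(tree)
```

The second child of head 3 is decoded with its sibling (node 1) available as context, which is exactly the higher-order information the parser exploits.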

Improving Recurrent Neural Network based Recommendations by Utilizing Embedding Matrix

Myung Ha Kwon, Sung Eon Kong, Yong Suk Choi

http://doi.org/10.5626/JOK.2018.45.7.659

Recurrent neural networks (RNNs) have recently been applied successfully to recommendation tasks: session-based recommendation, which recommends items using only the records within a session, and movie recommendation, which recommends movies by analyzing consumption records collected across multiple visits to a website. These approaches improved over traditional methods on both tasks, where only implicit feedback such as clicks or purchase records is available. In this work, we propose applying weight tying to improve an existing RNN-based movie recommendation model. We also perform experiments with an incremental recommendation method to evaluate the performance of recommendation models more precisely.
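Weight tying means the item embedding matrix doubles as the output projection: items are scored by the dot product between the RNN's hidden state and each item's embedding, instead of learning a separate output layer. A pure-Python sketch under that assumption (the vectors are illustrative, and the paper's model computes the hidden state with an RNN):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def tied_scores(hidden, embedding_matrix):
    """Score every item by projecting the hidden state onto the shared
    (tied) embedding matrix rather than a separate output layer."""
    return [dot(hidden, item_vec) for item_vec in embedding_matrix]

embeddings = [
    [1.0, 0.0],   # item 0
    [0.0, 1.0],   # item 1
    [0.5, 0.5],   # item 2
]
hidden = [0.2, 0.8]
scores = tied_scores(hidden, embeddings)
```

Besides halving the number of item-related parameters, tying keeps input and output representations of the same item consistent, which is the motivation for applying it to the recommendation model.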


Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr