Digital Library [ Search Result ]
Korean Semantic Role Labeling with BERT
Jangseong Bae, Changki Lee, Soojong Lim, Hyunki Kim
http://doi.org/10.5626/JOK.2020.47.11.1021
Semantic role labeling is a natural language processing task that identifies relationships such as "who, what, how, and why" within a sentence. Semantic role labeling studies mainly use machine learning algorithms and the end-to-end method, which excludes hand-crafted feature information. Recently, a language model called BERT (Bidirectional Encoder Representations from Transformers) has emerged in the natural language processing field and outperforms the previous state-of-the-art models. The performance of end-to-end semantic role labeling is mainly influenced by the structure of the machine learning model and the pre-trained language model. Thus, in this paper, we apply BERT to Korean semantic role labeling to improve its performance. As a result, the Korean semantic role labeling model using BERT achieves 85.77%, which is better than the existing Korean semantic role labeling models.
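The abstract frames SRL as end-to-end tagging on top of a pre-trained BERT encoder. The sketch below shows that setup as token classification with Hugging Face transformers; the checkpoint name, tag-set size, and example sentence are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of BERT-based semantic role labeling as token classification.
# Checkpoint, label count, and example sentence are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

NUM_ROLE_LABELS = 27  # hypothetical size of a BIO-encoded semantic role tag set

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=NUM_ROLE_LABELS
)

sentence = "철수가 어제 학교에서 책을 읽었다"  # "Cheolsu read a book at school yesterday"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits      # (1, seq_len, NUM_ROLE_LABELS)
predicted_roles = logits.argmax(dim=-1)  # one role tag id per subword token
print(predicted_roles)
```

In practice the classification head would be fine-tuned on Korean SRL data; the point here is only that the encoder and the role tagger are trained end-to-end without separate feature extraction.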
English-to-Korean Machine Translation using Image Information
Jangseong Bae, Hyunsun Hwang, Changki Lee
http://doi.org/10.5626/JOK.2019.46.7.690
Machine translation automatically converts text in one language into another language. Conventional machine translation uses only text for translation, which is a disadvantage in that various information related to the input text cannot be utilized. In recent years, multimodal machine translation models have emerged that use images related to the input text as additional input, unlike conventional machine translation, which uses only textual data. In this paper, following recent research trends, image information is added at the decoding step of machine translation and used for English-to-Korean translation. In addition, we propose a model with a decoding gate that adjusts the textual and image information at decoding time. Our experimental results show that the proposed method performs better than the non-gated model.
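The key idea is a decoding gate that balances the decoder's textual state against an image feature at each step. Below is one plausible form of such a gate as a minimal PyTorch sketch; the layer sizes and the exact gating formula are assumptions and may differ from the paper's model.

```python
# Minimal sketch of a decoding gate mixing the decoder's textual state with an
# image feature vector at each decoding step. Dimensions and the gating formula
# are illustrative assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

class DecodingGate(nn.Module):
    def __init__(self, hidden_dim: int, image_dim: int):
        super().__init__()
        self.image_proj = nn.Linear(image_dim, hidden_dim)  # map image feature into decoder space
        self.gate = nn.Linear(hidden_dim * 2, hidden_dim)   # gate computed from both sources

    def forward(self, decoder_state: torch.Tensor, image_feat: torch.Tensor) -> torch.Tensor:
        img = torch.tanh(self.image_proj(image_feat))                          # (batch, hidden_dim)
        g = torch.sigmoid(self.gate(torch.cat([decoder_state, img], dim=-1)))  # per-dimension weight
        return g * decoder_state + (1.0 - g) * img  # gated mixture fed to the output layer

# Example: batch of 2, decoder hidden size 512, image feature size 2048 (e.g., a pooled CNN feature)
gate = DecodingGate(hidden_dim=512, image_dim=2048)
mixed = gate(torch.randn(2, 512), torch.randn(2, 2048))
print(mixed.shape)  # torch.Size([2, 512])
```

The gate lets the model lean on the image when the textual context is ambiguous and ignore it otherwise, which is the motivation given for preferring it over simple concatenation.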
Malware Detection Model with Skip-Connected LSTM RNN
Jangseong Bae, Changki Lee, Suno Choi, Jonghyun Kim
http://doi.org/10.5626/JOK.2018.45.12.1233
A program can be viewed as a sequence of consecutive opcodes, and malware is a malicious program. In this paper, we assume that a program is an opcode sequence carrying semantic information and detect malware using the Long Short-Term Memory Recurrent Neural Network (LSTM RNN), a deep learning model suitable for sequence modeling. For the experiments, the opcode sequence is divided into a uni-gram sequence and a tri-gram sequence and used as the input features of various deep learning models, which use the input opcode sequence to determine whether a program is a normal file or malware. We show that the proposed Skip-Connected LSTM RNN model is superior to the LSTM encoder and the Convolutional Neural Network (CNN) model for malware detection; in particular, the experimental results show that it performs better than the LSTM encoder and CNN models on the opcode tri-gram data.
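The described classifier reads an opcode n-gram sequence and outputs a malware/normal decision. The sketch below shows one way a skip-connected stack of LSTM layers could be wired; the vocabulary size, layer sizes, and the additive form of the skip connections are assumptions for illustration.

```python
# Minimal sketch of a skip-connected LSTM classifier over opcode sequences.
# Vocabulary size, dimensions, and the additive skip connections are assumptions.
import torch
import torch.nn as nn

class SkipConnectedLSTM(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 128, num_layers: int = 3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # one id per opcode uni-gram or tri-gram
        self.layers = nn.ModuleList(
            [nn.LSTM(hidden_dim if i else embed_dim, hidden_dim, batch_first=True)
             for i in range(num_layers)]
        )
        self.classifier = nn.Linear(hidden_dim, 1)  # malware vs. normal

    def forward(self, opcode_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(opcode_ids)              # (batch, seq_len, embed_dim)
        for i, lstm in enumerate(self.layers):
            out, _ = lstm(x)
            x = out + x if i > 0 else out       # skip connection once shapes match
        pooled = x.mean(dim=1)                  # average over the opcode sequence
        return self.classifier(pooled)          # logit: > 0 means predicted malware

model = SkipConnectedLSTM(vocab_size=5000)
logit = model(torch.randint(0, 5000, (4, 200)))  # batch of 4 opcode sequences of length 200
print(torch.sigmoid(logit).shape)                # torch.Size([4, 1])
```

The skip connections ease gradient flow through the stacked layers, which is the usual motivation for preferring them over a plain deep LSTM encoder on long opcode sequences.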
Korean Semantic Role Labeling using Stacked Bidirectional LSTM-CRFs
Syntactic information represents the dependency relation between predicates and arguments, and it is helpful for improving the performance of semantic role labeling (SRL) systems. However, syntactic analysis can cause computational overhead and propagate incorrect syntactic information. To solve this problem, we exclude syntactic information and use only morpheme information to construct SRL systems. In this study, we propose an end-to-end SRL system that uses only morpheme information with a stacked bidirectional LSTM-CRFs model, extending the LSTM RNN that is suitable for sequence labeling problems. Our experimental results show that the proposed model performs better than other models.
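The model takes only morpheme inputs, encodes them with stacked bidirectional LSTMs, and scores role tags per token. A minimal sketch of that encoder and emission layer follows; the vocabulary size, tag-set size, and layer sizes are assumptions, and a CRF layer (e.g., from the pytorch-crf package) would normally sit on top of the emission scores to model tag transitions.

```python
# Minimal sketch of a stacked bidirectional LSTM encoder producing per-token
# role-tag emission scores. Sizes are assumptions; CRF decoding is not shown.
import torch
import torch.nn as nn

class StackedBiLSTMTagger(nn.Module):
    def __init__(self, vocab_size: int, num_tags: int, embed_dim: int = 100,
                 hidden_dim: int = 200, num_layers: int = 3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)      # morpheme embeddings, no syntax features
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers,
                              batch_first=True, bidirectional=True)
        self.emissions = nn.Linear(hidden_dim * 2, num_tags)  # per-token scores for each role tag

    def forward(self, morpheme_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(morpheme_ids)
        out, _ = self.bilstm(x)          # (batch, seq_len, 2 * hidden_dim)
        return self.emissions(out)       # a CRF layer would decode over these scores

tagger = StackedBiLSTMTagger(vocab_size=30000, num_tags=27)
scores = tagger(torch.randint(0, 30000, (2, 20)))  # 2 sentences of 20 morphemes each
print(scores.shape)                                # torch.Size([2, 20, 27])
```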