TY  - JOUR
T1  - Korean Dependency Parsing using the Self-Attention Head Recognition Model
AU  - Lim, Joon-Ho
AU  - Kim, Hyun-ki
JO  - Journal of KIISE, JOK
PY  - 2019
DA  - 2019/1/14
DO  - 10.5626/JOK.2019.46.1.22
KW  - dependency parsing
KW  - deep learning
KW  - natural language processing
AB  - Dependency parsing is the problem of resolving structural ambiguities in natural language sentences. Recently, various deep learning techniques have been applied to it and have shown high performance. In this paper, we analyze deep-learning-based dependency parsing in three stages. The first stage represents each word (eojeol), the unit of dependency parsing. The second stage reflects contextual information from the surrounding words for each word. The last stage recognizes the head word and the dependency label. For word representation, we propose using the max-pooling method widely used in CNN models. For contextual representation, we apply the Minimal-RNN Unit, which has lower computational complexity than the LSTM and GRU. Finally, we propose a Self-Attention Head Recognition Model that incorporates relative-distance embeddings between words for head-word recognition and applies multi-task learning to recognize dependency labels simultaneously. For the evaluation, the SEJONG phrase-structure parsing corpus was converted according to the TTA Standard Dependency Guideline. The proposed model achieved a parsing accuracy of 93.38% UAS and 90.42% LAS.
ER  -