TY  - JOUR
T1  - Korean Machine Reading Comprehension with S²-Net
AU  - Park, Cheoneum
AU  - Lee, Changki
AU  - Hong, Sulyn
AU  - Hwang, Yigyu
AU  - Yoo, Taejoon
AU  - Kim, Hyunki
JO  - Journal of KIISE, JOK
PY  - 2018
DA  - 2018/1/14
DO  - 10.5626/JOK.2018.45.12.1260
KW  - machine reading comprehension
KW  - question answering
KW  - simple recurrent unit
KW  - self-matching network
KW  - Korean machine reading comprehension dataset
AB  - Machine reading comprehension is the task of understanding a given context and identifying the right answer within it. The simple recurrent unit (SRU) mitigates the vanishing gradient problem of recurrent neural networks (RNN) by using neural gates, as in the gated recurrent unit (GRU), and improves speed by removing the previous hidden state from the gate input. The self-matching network, used in R-Net, has an effect similar to coreference resolution: by computing attention weights over its own RNN sequence, it can capture related semantic context information. In this paper, we propose an S²-Net model that adds a self-matching layer to an encoder composed of stacked SRUs, and we construct a Korean machine reading comprehension dataset. Experimental results show that the proposed S²-Net model achieves 70.81% EM and 82.48% F1 on Korean machine reading comprehension.
ER  - 