Search: [ keyword: intention analysis ] (2)

Semantic Similarity-based Intent Analysis using Pre-trained Transformer for Natural Language Understanding

Sangkeun Jung, Hyein Seo, Hyunji Kim, Taewook Hwang

http://doi.org/10.5626/JOK.2020.47.8.748

Natural language understanding (NLU) is a central technology for developing robots, smart messengers, and natural language interfaces. In this study, we propose a novel similarity-based intent analysis method, instead of the typical classification approach, for the intent analysis problem in NLU. To accomplish this, neural network-based text and semantic frame readers are introduced to learn semantic vectors from pairwise text-semantic frame instances. Text-to-vector and semantic-frame-to-vector projection methods using a pre-trained transformer are proposed. We then attach to a query sentence the intent tag of its nearest training sentence, found by measuring semantic vector distances in the vector space. Four natural language understanding experiments on Korean and English corpora suggest that the proposed method outperforms existing intent analysis techniques: the two Korean experiments use weather and navigation corpora, and the two English experiments use air travel information system and voice platform corpora.
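The nearest-neighbor intent assignment described above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example that encodes sentences with a generic pre-trained transformer (Hugging Face bert-base-multilingual-cased with mean pooling, chosen here only for illustration) and tags a query with the intent of its most similar training sentence by cosine similarity; it does not reproduce the paper's text and semantic frame readers or its exact projection methods.

# Hypothetical sketch: nearest-neighbor intent tagging with a pre-trained
# transformer encoder (not the authors' exact text/semantic-frame readers).
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # placeholder encoder choice
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def embed(sentences):
    """Project sentences to semantic vectors via mean pooling."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()      # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)               # (B, H)

# Toy training set: (sentence, intent) pairs.
train_sentences = ["what's the weather tomorrow", "navigate to the airport"]
train_intents = ["weather.ask", "navigation.route"]
train_vecs = torch.nn.functional.normalize(embed(train_sentences), dim=-1)

def predict_intent(query):
    """Attach the intent of the nearest training sentence in vector space."""
    q = torch.nn.functional.normalize(embed([query]), dim=-1)
    sims = q @ train_vecs.T                                   # cosine similarities
    return train_intents[int(sims.argmax())]

print(predict_intent("will it rain this weekend"))            # -> "weather.ask"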

Speakers’ Intention Analysis Based on Partial Learning of a Shared Layer in a Convolutional Neural Network

Minkyoung Kim, Harksoo Kim

http://doi.org/10.5626/JOK.2017.44.12.1252

In dialogues, a speaker's intention can be represented by a set consisting of an emotion, a speech act, and a predicator. Therefore, dialogue systems should capture and process these implied characteristics of utterances. Many previous studies have treated these determinations as independent classification problems, but others have shown that they are associated with each other. In this paper, we propose an integrated model that simultaneously determines emotions, speech acts, and predicators using a convolutional neural network. The proposed model consists of particular abstraction layers, in which mutually independent information about each characteristic is abstracted, and a shared abstraction layer, in which combinations of this independent information are abstracted. During training, the errors of emotions, speech acts, and predicators are partially back-propagated through the layers. In the experiments, the proposed integrated model showed better performance than independent determination models (by 2%p in emotion determination, 11%p in speech act determination, and 3%p in predicator determination).
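The integrated architecture can be sketched roughly as follows. The code below is a hypothetical PyTorch approximation with one task-specific ("particular") convolution branch per characteristic and a shared layer on top; the summed joint loss is a simplification of the paper's partial back-propagation scheme, and all layer sizes and label counts are placeholder values.

# Hypothetical sketch: a multi-task CNN with task-specific ("particular")
# convolution branches and a shared layer, loosely following the description
# above (not the authors' exact architecture or partial-learning scheme).
import torch
import torch.nn as nn

class MultiTaskIntentCNN(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, n_filters=64,
                 n_emotions=7, n_speech_acts=15, n_predicators=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One "particular" convolution branch per characteristic.
        self.conv_emo = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.conv_act = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.conv_prd = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        # Shared layer abstracts the combination of the three branches.
        self.shared = nn.Linear(3 * n_filters, n_filters)
        self.out_emo = nn.Linear(n_filters, n_emotions)
        self.out_act = nn.Linear(n_filters, n_speech_acts)
        self.out_prd = nn.Linear(n_filters, n_predicators)

    def forward(self, token_ids):                        # (B, T)
        x = self.embed(token_ids).transpose(1, 2)         # (B, E, T)
        pool = lambda conv: torch.relu(conv(x)).max(dim=2).values
        h = torch.cat([pool(self.conv_emo),
                       pool(self.conv_act),
                       pool(self.conv_prd)], dim=1)       # (B, 3F)
        s = torch.relu(self.shared(h))                    # shared abstraction
        return self.out_emo(s), self.out_act(s), self.out_prd(s)

# Joint training: the three losses are summed so each task's error flows
# back through the shared layer (a simplification of partial back-propagation).
model = MultiTaskIntentCNN(vocab_size=10000)
criterion = nn.CrossEntropyLoss()
tokens = torch.randint(0, 10000, (4, 20))                 # toy batch of token ids
emo, act, prd = model(tokens)
loss = (criterion(emo, torch.randint(0, 7, (4,))) +
        criterion(act, torch.randint(0, 15, (4,))) +
        criterion(prd, torch.randint(0, 50, (4,))))
loss.backward()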

