Search : [ keyword: deep-learning ] (7)

Approximating the Accuracy of Classification Models Using Self-differential Testing

Jubin Lee, Taeho Kim, Yu-Seung Ma

http://doi.org/10.5626/JOK.2022.49.12.1143

Differential testing is a traditional software testing technique that detects errors by observing whether similar applications generate different outputs for the same input; it is also used in artificial intelligence systems. Existing research incurs the cost of finding a high-quality reference neural network that performs the same function as the target network but has a different architecture. We propose a self-differential testing technique that evaluates a classification model by constructing a reference model from the target neural network itself, eliminating the need to find a network with a different architecture. Experiments confirmed that self-differential testing produces similar effects at a lower cost than existing approaches that require separate reference models. In addition, we propose an accuracy approximation method for classification models using self-differential analysis, an application of self-differential testing. In experiments on datasets similar to MNIST and CIFAR10, the accuracy approximated through self-differential testing differed from the actual accuracy by only 0.0002 to 0.09.
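The core of the approach above is comparing a target model's predictions with those of a reference model derived from the target itself; the fraction of agreeing predictions serves as the accuracy estimate. A minimal sketch of that agreement computation (the function name is illustrative, not from the paper):

```python
def agreement_rate(target_preds, reference_preds):
    """Fraction of inputs on which the two models predict the same
    class; in self-differential testing this fraction approximates
    the target model's accuracy on unlabeled data."""
    assert len(target_preds) == len(reference_preds)
    matches = sum(t == r for t, r in zip(target_preds, reference_preds))
    return matches / len(target_preds)

# Example: two models agree on 3 of 4 inputs.
print(agreement_rate([1, 2, 3, 4], [1, 2, 0, 4]))  # 0.75
```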

Prediction of Blood Glucose in Diabetic Inpatients Using LSTM Neural Network

Sang Hyeon Kim, Han Beom Lee, Seong Wan Jeon, Dae Yeon Kim, Sang Jeong Lee

http://doi.org/10.5626/JOK.2020.47.12.1120

Diabetes is a chronic disease that causes serious complications; in clinical practice, doctors predict future changes in blood glucose based on a patient's past blood glucose trends and treat accordingly. Recently, CGM (Continuous Glucose Monitoring) devices, which automatically measure blood glucose every five minutes to monitor continuous changes, have been introduced and are widely used in clinical applications. Based on CGM blood glucose results, doctors predict the timing of insulin administration and identify high-risk diabetes patients. In this paper, a blood glucose prediction model based on a deep learning neural network is proposed. The proposed model is designed around an LSTM (Long Short-Term Memory) neural network that takes historical blood glucose data as well as variables such as HbA1c (glycated hemoglobin) and BMI (body mass index). It was trained and tested on CGM blood glucose data from Type 2 diabetes inpatients at a university hospital. By incorporating patient characteristics, the proposed model shows up to a 50% improvement in blood glucose prediction accuracy over the LSTM model of a previous study.
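As an illustration of the input construction described above, the sketch below builds sliding windows that pair an hour of 5-minute CGM readings with static patient variables; the window sizes and feature values are assumptions, not the paper's exact configuration:

```python
def make_windows(cgm, static_feats, history=12, horizon=6):
    """Build (input, target) pairs from a CGM series sampled every
    5 minutes: `history` past readings (12 = one hour) plus static
    features such as HbA1c and BMI, predicting the reading
    `horizon` steps ahead (6 steps = 30 minutes)."""
    samples = []
    for i in range(len(cgm) - history - horizon + 1):
        x = list(cgm[i:i + history]) + list(static_feats)
        y = cgm[i + history + horizon - 1]
        samples.append((x, y))
    return samples
```

Each `x` would feed the LSTM's input sequence (with the static features appended or broadcast per step, depending on the design), and `y` is the regression target.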

Automatic Text Summarization Based on Selective OOV Copy Mechanism with BERT Embedding

Tae-Seok Lee, Seung-Shik Kang

http://doi.org/10.5626/JOK.2020.47.1.36

Automatic text summarization is the process of shortening a text document via extraction or abstraction. Abstractive text summarization relies on pre-generated word embedding information, but low-frequency, salient words such as technical terms are seldom included in the vocabulary; these are so-called out-of-vocabulary (OOV) words. OOV words degrade the performance of encoder-decoder neural network models. To address OOV words in abstractive text summarization, we propose a copy mechanism that copies new words from the input document when generating summary sentences. Unlike previous studies, the proposed approach combines accurate pointing information, a selective copy mechanism, BERT embeddings, random masking of OOV words, and sentence reconstruction from morphemes. Additionally, a neural network gate model for estimating the generation probability and a loss function for optimizing the entire abstraction model were applied. Experimental results demonstrate that the ROUGE-1 (word recall) and ROUGE-L (longest common subsequence) scores of the proposed encoder-decoder model improved to 54.97 and 39.23, respectively.
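The generation-probability gate mentioned above mixes the decoder's vocabulary distribution with a copy distribution over source words. A minimal sketch of that mixing step, in the standard pointer-generator formulation (not the paper's exact code):

```python
def mix_distributions(p_gen, vocab_dist, copy_dist):
    """Final word distribution of a pointer-generator step:
    p_gen weights the decoder's vocabulary distribution and
    (1 - p_gen) weights the copy distribution over source words,
    so OOV words can be emitted by copying from the source."""
    words = set(vocab_dist) | set(copy_dist)
    return {w: p_gen * vocab_dist.get(w, 0.0)
               + (1 - p_gen) * copy_dist.get(w, 0.0)
            for w in words}
```

With `p_gen` close to 1 the model generates from its vocabulary; with `p_gen` close to 0 it copies, which is how a source-only term can appear in the summary.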

Sports Broadcasting with Deep Learning

Byeong Jo Kim, Yong Suk Choi

http://doi.org/10.5626/JOK.2019.46.10.1020

Sports broadcasting requires understanding and reasoning about the current situation based on information from sports scenes, players, and past knowledge. In this paper, we introduce how a scene classifier, a player detector, and a motion recognizer can be used to obtain information from sports images and understand the current situation. We created three types of commentary: one from web data, another from 13 scene types identified by the scene classifier, and a third generated from player positions, eight motion types, and an ontology. Data from KBO (Korea Baseball Organization) League games from April 1, 2018, to April 14, 2018, were manually labeled to train the model.
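The third commentary type combines recognized scenes and motions; a toy sketch of such a template-based generation step follows (the scene and motion labels and templates are invented for illustration, not taken from the paper):

```python
# Hypothetical templates keyed by (scene, motion) pairs produced by
# the scene classifier and motion recognizer.
TEMPLATES = {
    ("pitch", "swing"): "The batter swings at the pitch!",
    ("outfield", "catch"): "The outfielder makes the catch.",
}

def comment(scene, motion):
    """Pick a commentary line for a recognized scene and motion,
    falling back to a generic description when no template matches."""
    return TEMPLATES.get((scene, motion), f"{motion} in the {scene}.")
```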

Malware Detection Model with Skip-Connected LSTM RNN

Jangseong Bae, Changki Lee, Suno Choi, Jonghyun Kim

http://doi.org/10.5626/JOK.2018.45.12.1233

A program can be viewed as a sequence of consecutive opcodes, and malware is a malicious program. In this paper, we treat a program as a sequence of opcodes carrying semantic information and detect malware using a Long Short-Term Memory Recurrent Neural Network (LSTM RNN), a deep learning model suitable for sequence data modeling. For the experiments, the opcode sequence is divided into uni-gram and tri-gram sequences and used as input features for various deep learning models, which determine whether a program is a normal file or malware. Experimental results show that the proposed Skip-Connected LSTM RNN model outperforms the LSTM encoder and the Convolutional Neural Network (CNN) model on the opcode tri-gram data.
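The uni-gram and tri-gram features above come from sliding a fixed-size window over the opcode sequence; a minimal sketch (the opcode names in the example are illustrative):

```python
def opcode_ngrams(opcodes, n=3):
    """Slide a window of size n over an opcode sequence to produce
    n-gram features (n=1 gives uni-grams, n=3 gives tri-grams)."""
    return [tuple(opcodes[i:i + n]) for i in range(len(opcodes) - n + 1)]
```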

Research on Joint Models for Korean Word Spacing and POS (Part-Of-Speech) Tagging based on Bidirectional LSTM-CRF

Seon-Wu Kim, Sung-Pil Choi

http://doi.org/10.5626/JOK.2018.45.8.792

In general, Korean part-of-speech tagging takes as input a sentence in which word spacing is already complete. To process a sentence that is not properly spaced, automatic spacing is needed to correct the errors. However, if automatic spacing and POS tagging are performed sequentially, errors at each step can compound, causing serious performance degradation. In this study, we address this problem by constructing an integrated model that performs automatic spacing and POS (Part-Of-Speech) tagging simultaneously. Based on the Bidirectional LSTM-CRF model, we propose an integrated model that performs syllable-based word spacing and POS tagging complementarily at the same time. In experiments on the Sejong tagged corpus, we obtained 98.77% POS tagging accuracy for completely spaced sentences and 97.92% morpheme accuracy for sentences without any word spacing.
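Syllable-level joint labels can fuse a spacing marker with a POS tag so that one tagger predicts both tasks at once; a minimal sketch of that label construction (the exact tag scheme here is an assumption, not necessarily the paper's):

```python
def to_joint_tags(tagged_words):
    """Convert (word, POS) pairs to per-syllable joint labels:
    'B-' marks a word-initial syllable (a space precedes it) and
    'I-' a word-internal one, each fused with the word's POS tag.
    Predicting these labels recovers both spacing and POS."""
    tags = []
    for word, pos in tagged_words:
        for i, syllable in enumerate(word):
            boundary = "B" if i == 0 else "I"
            tags.append((syllable, f"{boundary}-{pos}"))
    return tags
```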

Korean Semantic Role Labeling using Stacked Bidirectional LSTM-CRFs

Jangseong Bae, Changki Lee

http://doi.org/

Syntactic information represents the dependency relations between predicates and arguments and is helpful for improving the performance of semantic role labeling (SRL) systems. However, syntactic analysis incurs computational overhead and can propagate incorrect syntactic information. To avoid this problem, we exclude syntactic information and use only morpheme information to build an SRL system. In this study, we propose an end-to-end SRL system based on a Stacked Bidirectional LSTM-CRFs model, which extends the LSTM RNN suitable for sequence labeling problems, using morpheme information alone. Experimental results show that the proposed model performs better than other models.
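The sequence labels such a system predicts are typically BIO-encoded argument spans; a minimal sketch of decoding them back into spans (assumes well-formed BIO output, and is not the paper's code):

```python
def bio_to_spans(tags):
    """Decode well-formed BIO tags into (label, start, end) spans
    (end exclusive), e.g. semantic role arguments like ARG0."""
    spans, start, label = [], None, None
    for i, t in enumerate(tags + ["O"]):  # sentinel flushes the last span
        if t == "O" or t.startswith("B-"):
            if label is not None:
                spans.append((label, start, i))
            label = t[2:] if t.startswith("B-") else None
            start = i if label is not None else None
    return spans
```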



Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr