An Efficient and Differentially Private K-Means Clustering Algorithm Using the Voronoi Diagram
http://doi.org/10.5626/JOK.2020.47.9.879
Recently, studies have been conducted on preventing the leakage of personal information through data analysis results. Among them, differential privacy is a widely studied standard, since it guarantees rigorous and provable privacy preservation. In this paper, we propose a Voronoi-diagram-based algorithm that publishes K-means clustering results for 2D data while guaranteeing differential privacy. Existing algorithms have the disadvantage that the number of samples is difficult to select, since both the running time and the accuracy of the clustering results change with the number of samples. The proposed algorithm, in contrast, quickly provides an accurate clustering result without requiring such a parameter. We demonstrate the performance of the proposed algorithm through experiments on real-life data.
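
For orientation, the following is a minimal sketch of a generic epsilon-differentially private K-means step that perturbs per-cluster counts and coordinate sums with the Laplace mechanism. It is not the paper's Voronoi-diagram-based algorithm; the budget split, the fixed number of iterations, and the assumption that the 2D points lie in [0, 1]^2 are illustrative choices only.

# Illustrative sketch of differentially private K-means (Laplace mechanism).
# NOT the paper's Voronoi-based method; it only shows how noisy cluster
# statistics can be used to release centroids under differential privacy.
import numpy as np

def dp_kmeans(points, k, epsilon, iters=5, rng=None):
    """K-means on 2D points in [0, 1]^2 with noisy per-cluster statistics."""
    rng = np.random.default_rng() if rng is None else rng
    centroids = rng.uniform(0.0, 1.0, size=(k, 2))
    eps_step = epsilon / iters            # split the total budget over iterations
    eps_count = eps_sum = eps_step / 2.0  # and over the two queries per iteration
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = points[labels == j]
            # Sensitivities: a single point changes the count by at most 1 and
            # the 2D coordinate sum by at most 2 in L1 norm (points in [0, 1]^2).
            noisy_count = len(members) + rng.laplace(scale=1.0 / eps_count)
            noisy_sum = members.sum(axis=0) + rng.laplace(scale=2.0 / eps_sum, size=2)
            if noisy_count > 1.0:
                centroids[j] = np.clip(noisy_sum / noisy_count, 0.0, 1.0)
    return centroids

# Example: cluster 1,000 random 2D points with a total budget of epsilon = 1.0.
pts = np.random.default_rng(0).uniform(0, 1, size=(1000, 2))
print(dp_kmeans(pts, k=3, epsilon=1.0))
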
Hybrid Word-Character Neural Network Model for the Improvement of Document Classification
http://doi.org/10.5626/JOK.2017.44.12.1290
Document classification, the task of assigning a category to each document based on its text, is one of the fundamental tasks in natural language processing. It is used in various applications such as topic classification and sentiment classification. Neural network models for document classification can be divided into two categories: word-level models and character-level models, which treat words and characters as their basic units, respectively. In this study, we propose a neural network model that combines character-level and word-level models to improve document classification performance. The proposed model extracts the feature vector of each word by combining information obtained from a word embedding matrix with information encoded by a character-level neural network. Based on these word feature vectors, the model classifies documents with a hierarchical structure in which recurrent neural networks with attention mechanisms are used at both the word and sentence levels. Experiments on real-life datasets demonstrate the effectiveness of the proposed model.
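
As a rough illustration of the word-level feature extraction described above, the sketch below concatenates a word's row in an embedding matrix with a character-level encoding of that word. The use of a bidirectional GRU as the character encoder, the layer sizes, and all hyperparameters are assumptions for illustration, not the paper's exact architecture, and the hierarchical word- and sentence-level attention layers that follow in the paper are omitted.

# Illustrative sketch: hybrid word + character features for each word.
import torch
import torch.nn as nn

class HybridWordEncoder(nn.Module):
    def __init__(self, word_vocab, char_vocab, word_dim=100, char_dim=25, char_hidden=25):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # Bidirectional GRU over a word's characters; its final states capture
        # subword information (morphology, rare or misspelled words).
        self.char_rnn = nn.GRU(char_dim, char_hidden, batch_first=True, bidirectional=True)
        self.out_dim = word_dim + 2 * char_hidden

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        b, s, c = char_ids.shape
        w = self.word_emb(word_ids)                      # (b, s, word_dim)
        ch = self.char_emb(char_ids.view(b * s, c))      # (b*s, c, char_dim)
        _, h = self.char_rnn(ch)                         # h: (2, b*s, char_hidden)
        ch_vec = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        # Concatenate word-level and character-level information per word.
        return torch.cat([w, ch_vec], dim=-1)            # (b, s, out_dim)

# Example: 2 documents, 5 words each, up to 8 characters per word.
enc = HybridWordEncoder(word_vocab=10000, char_vocab=80)
words = torch.randint(1, 10000, (2, 5))
chars = torch.randint(1, 80, (2, 5, 8))
print(enc(words, chars).shape)   # torch.Size([2, 5, 150])
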