Search : [ author: Je-Min Kim ] (7)

An Ontology-based Description Approach to Temporal Transitions of Events

Je-Min Kim, Young-Tack Park, Sang-Min Kim, Yukyung Shin

http://doi.org/10.5626/JOK.2023.50.6.484

Describing, in a standardized form, the information that changes over time between events and entities is highly useful, and various studies have used ontologies to describe the temporal information of events for this purpose. The concept of lineage makes it possible to effectively describe the state transitions and connectivity of events over time. In this paper, we propose a method that uses ontology-based event lineage and entity lineage to describe, in a standardized form, the processes by which events and entities change over time. First, time formats were classified into instant, interval, duration, and periodic and expressed as ontology instances, with each event assigned one time format. The event and entity information expressed in the ontology was then described as lineage. To verify the relevance and usefulness of this approach, an experiment on processing temporal-relation queries based on Allen’s temporal relations showed a 15.02% improvement in response time.
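As an illustrative sketch (not the paper’s implementation), the Allen temporal relation between two intervals can be classified by comparing their endpoints; the relation names are the standard ones from Allen’s interval algebra.

```python
# Classify the Allen relation of interval A (a_start, a_end) with
# respect to interval B (b_start, b_end). Inverse relations are
# omitted for brevity; they mirror the ones shown.

def allen_relation(a_start, a_end, b_start, b_end):
    """Return the Allen relation of interval A with respect to B."""
    if a_end < b_start:
        return "before"
    if a_end == b_start:
        return "meets"
    if a_start == b_start and a_end == b_end:
        return "equal"
    if a_start < b_start and b_start < a_end < b_end:
        return "overlaps"
    if a_start == b_start and a_end < b_end:
        return "starts"
    if a_start > b_start and a_end == b_end:
        return "finishes"
    if a_start > b_start and a_end < b_end:
        return "during"
    return "other"  # inverse relations (after, met-by, ...) omitted
```

A temporal-relation query such as "which events occurred during event E" then reduces to evaluating this predicate over the interval-typed time instances.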

An Approach to a Learning Prediction Model for Recognition of Daily Life Pattern based on Event Calculus

Seok-Hyun Bae, Sung-hyuk Bang, Hyun-Kyu Park, Myung-Joong Jeon, Je-Min Kim, Young-Tack Park

http://doi.org/10.5626/JOK.2018.45.5.466

With advances in machine learning algorithms, several studies have been conducted on data analysis and result prediction. However, cleaning the noise in real-life datasets remains a major problem, hindering clear recognition of the complex patterns of human intention. To overcome this limitation, this paper proposes an event calculus methodology with three additional steps for the recognition of human intention: intention reasoning, conflict resolution, and noise reduction. Intention reasoning identifies the intention in time-series living-pattern data. In conflict resolution, the ongoing intentions and the newly inferred intention are checked against a conflict graph, so that intentions that can occur in parallel are inferred together. Finally, for noise reduction, intentions inferred from noise in the living-pattern data are filtered using the history of fluents. To evaluate the event calculus module, this paper also proposes a data generation methodology based on a Gaussian mixture model and heuristic rules. Performance was estimated on 300 sequential instances with 5 intentions observed over 13 hours, and an accuracy of 89.3% was achieved between the probabilistic model and the event calculus module.
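The conflict-resolution step can be sketched minimally as follows, assuming the conflict graph is given as pairs of mutually exclusive intentions; the intention names here are hypothetical, not from the paper.

```python
# Minimal sketch: admit a newly inferred intention only if it can
# occur in parallel with every ongoing intention, per a conflict
# graph of mutually exclusive intention pairs (hypothetical names).

CONFLICTS = {
    frozenset({"cooking", "sleeping"}),
    frozenset({"watching_tv", "sleeping"}),
}

def resolve(ongoing, inferred):
    """Return the updated set of ongoing intentions."""
    for intent in ongoing:
        if frozenset({intent, inferred}) in CONFLICTS:
            return set(ongoing)  # conflict: reject the inferred intention
    return set(ongoing) | {inferred}
```

Intentions that pass this check proceed to the noise-reduction step, where the fluent history filters out spurious inferences.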

Approach for Learning Intention Prediction Model based on Recurrent Neural Network

Sung-hyuk Bang, Seok-Hyun Bae, Hyun-Kyu Park, Myung-Joong Jeon, Je-Min Kim, Young-Tack Park

http://doi.org/10.5626/JOK.2018.45.4.360

Several studies have been conducted on human intention prediction with the help of machine learning models. However, these studies have indicated a fundamental shortcoming of conventional machine learning models: they are unable to reflect a long span of past information. To overcome this limitation, this paper proposes a human intention prediction model based on a recurrent neural network (RNN). The RNN model classifies patterns in time-series data by reflecting the previous sequence patterns of that data. For intention prediction with the proposed model, an RNN was trained to classify predefined intentions using attributes such as time, location, activity, and detected objects in a house. Each RNN node is composed of a long short-term memory (LSTM) cell to solve the long-term dependency problem. To evaluate the proposed intention prediction model, a data generator based on a weighted-graph structure was developed to generate data on a daily basis. Using 23,000 data instances for training and testing the proposed intention prediction model, a prediction accuracy of 90.52% was achieved.
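The LSTM cell that addresses the long-term dependency problem can be illustrated with a single toy cell step in pure Python; the scalar weights below are fixed toy values, not the trained parameters from the paper.

```python
import math

# Toy single-step LSTM cell: the gated cell state c carries
# information across long spans, which plain RNN cells cannot.
# All gates share one scalar weight here purely for brevity.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w=0.5, u=0.5, b=0.0):
    f = sigmoid(w * x + u * h_prev + b)    # forget gate
    i = sigmoid(w * x + u * h_prev + b)    # input gate
    o = sigmoid(w * x + u * h_prev + b)    # output gate
    g = math.tanh(w * x + u * h_prev + b)  # candidate cell state
    c = f * c_prev + i * g                 # cell state: long-term memory
    h = o * math.tanh(c)                   # hidden state: step output
    return h, c
```

In the full model, a sequence of attribute vectors (time, location, activity, detected objects) would be fed step by step, and a softmax over the final hidden state would yield the predicted intention class.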

An Approach of Scalable SHIF Ontology Reasoning using Spark Framework

Je-Min Kim, Young-Tack Park

http://doi.org/

To manage knowledge systems, systems that automatically infer and manage scalable knowledge are required. Most such systems use ontologies to exchange knowledge between machines and to infer new knowledge, so approaches are needed that perform inference over scalable ontologies. In this paper, we propose an approach to rule-based reasoning for scalable SHIF ontologies in the Spark framework, which works similarly to MapReduce over distributed memories on a cluster. To perform efficient reasoning in distributed memories, we focus on three areas. First, we define a data structure for splitting scalable ontology triples into small sets according to each reasoning rule and loading these triple sets into distributed memories. Second, a rule execution order and iteration conditions are defined based on the dependencies and correlations among the SHIF rules. Finally, we explain the operations used to execute the rules, which are based on the reasoning algorithms. To evaluate the suggested methods, we performed an experiment against WebPIE, a representative cluster-based ontology reasoner, using the LUBM benchmark, a standard dataset for evaluating ontology inference and search speed. The proposed approach improved throughput by 28,400% (157k triples/sec) over WebPIE (553 triples/sec) on LUBM.
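The iterative rule execution described above can be sketched on a single machine with plain Python sets standing in for Spark RDDs; the rule shown (subClassOf transitivity) is one representative ontology rule, and the loop stops at the fixpoint where no new triples are inferred.

```python
# Single-machine sketch of rule-based reasoning to a fixpoint.
# In the actual approach, each rule's triple sets live in distributed
# cluster memories and the join runs as a Spark operation.

def closure(triples):
    """Apply the subClassOf-transitivity rule until no new triples appear."""
    inferred = set(triples)
    while True:
        new = {(a, "subClassOf", c)
               for (a, p1, b) in inferred if p1 == "subClassOf"
               for (b2, p2, c) in inferred
               if p2 == "subClassOf" and b2 == b}
        new -= inferred
        if not new:        # iteration condition: fixpoint reached
            return inferred
        inferred |= new
```

Splitting the triples per rule (here, keeping only subClassOf triples in the join) is what keeps each distributed join small.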

A Scalable OWL Horst Lite Ontology Reasoning Approach based on Distributed Cluster Memories

Je-Min Kim, Young-Tack Park

http://doi.org/

Current ontology studies use the Hadoop distributed storage framework to perform MapReduce-based reasoning for scalable ontologies. In this paper, however, we propose a novel approach for scalable Web Ontology Language (OWL) Horst Lite ontology reasoning based on distributed cluster memories. Rule-based reasoning, which is frequently used for scalable ontologies, iteratively executes triple-format ontology rules until no new data is inferred. Therefore, when scalable ontology reasoning is performed on computer hard drives, the ontology reasoner suffers from performance limitations. To overcome this drawback, we propose an approach that loads the ontologies into distributed cluster memories using Spark (a memory-based distributed computing framework), which executes the ontology reasoning. To implement an appropriate OWL Horst Lite ontology reasoning system on Spark, our method divides the scalable ontologies into blocks, loads each block into the cluster nodes, and subsequently handles the data in the distributed memories. We used the Lehigh University Benchmark (LUBM), which is used to evaluate ontology inference and search speed, to experimentally evaluate the suggested methods, applying them to LUBM8000 (1.1 billion triples, 155 gigabytes). Compared with WebPIE, a representative MapReduce-based scalable ontology reasoner, the proposed approach showed a throughput improvement of 320% (62k triples/sec) over WebPIE (19k triples/sec).
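The block-division step can be sketched with an in-memory dictionary standing in for the cluster memories: grouping triples by predicate lets each reasoning rule load only the block(s) it actually joins over. This is an illustration of the idea, not the paper’s partitioning scheme.

```python
from collections import defaultdict

# Sketch: divide ontology triples into blocks keyed by predicate.
# In the actual system each block would be loaded into a cluster
# node's memory; here a plain dict plays that role.

def partition_by_predicate(triples):
    """Group triples into per-predicate blocks."""
    blocks = defaultdict(set)
    for s, p, o in triples:
        blocks[p].add((s, p, o))
    return dict(blocks)
```

A rule such as type propagation then touches only the `rdf:type` and `rdfs:subClassOf` blocks instead of scanning all 1.1 billion triples.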

MOnCa2: High-Level Context Reasoning Framework based on User Travel Behavior Recognition and Route Prediction for Intelligent Smartphone Applications

Je-Min Kim, Young-Tack Park

http://doi.org/

MOnCa2 is a framework for building intelligent smartphone applications based on smartphone sensors and ontology reasoning. In previous studies, MOnCa determined and inferred user situations based on sensor values represented as ontology instances. With that approach, recognizing the user’s spatial information or objects in the user’s surroundings is possible, whereas determining the user’s physical context (travel behavior, travel destination) is not. In this paper, MOnCa2 builds recognition models for travel behavior and routes from smartphone sensors to analyze the user’s physical context, infers basic context regarding the user’s travel behavior and routes by applying these models, and generates high-level context by applying ontology reasoning to the basic context for intelligent applications. The paper focuses on approaches that recognize the user’s travel behavior using smartphone accelerometers, predict personal routes and destinations using GPS signals, and infer high-level context by applying ontology realization.
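Accelerometer-based travel-behavior recognition can be illustrated with a simple windowed feature: the variance of acceleration magnitude separates stationary from moving states. The threshold and labels below are hypothetical placeholders, not the recognition model trained in the paper.

```python
import statistics

# Illustrative sketch: classify a window of accelerometer magnitude
# samples (m/s^2) by variance. A real model would use richer features
# and learned boundaries; the 0.5 threshold here is a made-up value.

def travel_behavior(magnitudes, threshold=0.5):
    """Return a coarse travel-behavior label for one sensor window."""
    return "walking" if statistics.variance(magnitudes) > threshold else "still"
```

The resulting basic-context label would then be asserted as an ontology instance, over which the high-level context reasoning operates.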

Distributed Table Join for Scalable RDFS Reasoning on Cloud Computing Environment

Wan-Gon Lee, Je-Min Kim, Young-Tack Park

http://doi.org/

A knowledge service system needs to infer new knowledge from asserted knowledge to provide effective services, and most knowledge service systems express their knowledge as ontologies. Because the volume of knowledge information in the real world is becoming massive, effective techniques for massive ontology data are drawing attention. This paper provides and evaluates methods for inferring over massive ontology data, to the extent of RDFS, in a cloud computing environment. The RDFS inference suggested in this paper focuses on two methods: one applying MapReduce based on an RDFS meta table, and one using cloud computing memory alone, without MapReduce, in a distributed file computing environment. The paper explains the inference system structure of each technique, the meta table setup according to the RDFS inference rules, and the algorithms of the inference strategies. To evaluate the suggested methods, we performed experiments with the LUBM benchmark, a standard dataset for evaluating ontology inference and search speed. For LUBM6000, the meta-table-based RDFS inference technique required 13.75 minutes (inferring 1,042 triples per second) for the complete inference, whereas the method using cloud computing memory needed 7.24 minutes (inferring 1,979 triples per second), roughly twice as fast.
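A meta-table-driven join can be sketched as follows: the meta table maps an RDFS rule to the predicates it joins, and the rule itself is a hash join over the two triple tables. The rule shown is rdfs9 (type propagation along subClassOf) from the standard RDFS entailment rules; the code is an illustration, not the paper’s MapReduce implementation.

```python
# Meta table: rule name -> (probe predicate, build predicate).
META = {"rdfs9": ("rdf:type", "rdfs:subClassOf")}

def apply_rdfs9(triples):
    """Hash join implementing rdfs9: (x type C) & (C subClassOf D) => (x type D)."""
    probe_pred, build_pred = META["rdfs9"]
    # Build side: hash table from class to its superclasses.
    supers = {}
    for s, p, o in triples:
        if p == build_pred:
            supers.setdefault(s, set()).add(o)
    # Probe side: stream type triples through the hash table.
    return {(s, probe_pred, sup)
            for s, p, o in triples if p == probe_pred
            for sup in supers.get(o, ())}
```

In the MapReduce variant, the build side of this join corresponds to the meta table broadcast to each mapper, while the memory-based variant keeps both sides resident in cluster memory.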



Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr