Search : [ keyword: Inference ] (16)

Distributed Assumption-Based Truth Maintenance System for Scalable Reasoning

Batselem Jagvaral, Young-Tack Park

http://doi.org/

An assumption-based truth maintenance system (ATMS) is a tool that maintains the reasoning process of an inference engine. It also supports non-monotonic reasoning based on dependency-directed backtracking. By bookkeeping all reasoning steps, it can quickly check and retract beliefs and efficiently provide solutions for problems with a large search space. However, the amount of data has recently grown exponentially, making it impossible to solve large-scale problems on a single machine, and the maintenance process for such problems incurs a high computation cost due to its large memory overhead. To overcome this drawback, this paper presents an approach for incrementally maintaining the reasoning process of an inference engine on a cluster using Spark. It maintains data dependencies such as assumptions, labels, environments, and justifications in parallel across a cluster of machines and efficiently propagates changes through a large amount of inferred data. We deployed the proposed ATMS on a cluster of 5 machines, conducted OWL/RDFS reasoning over the Lehigh University Benchmark (LUBM) data, and evaluated the system in terms of its performance and functionality, including assertion, explanation, and retraction. In our experiments, the proposed system performed these operations in a reasonably short time over an 80 GB inferred LUBM2000 dataset.
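To make the bookkeeping concrete, the following is a minimal single-machine sketch of the ATMS label and justification mechanics summarized above (nogood handling omitted). It is illustrative only and is not the distributed Spark implementation proposed in the paper, where labels and justifications would instead be partitioned across the cluster and updated in parallel; all names are hypothetical.

    from itertools import product

    class Node:
        """A datum with a label: the set of minimal assumption environments supporting it."""
        def __init__(self, datum, assumption=False):
            self.datum = datum
            # An assumption node supports itself; derived nodes start with an empty label.
            self.label = {frozenset([datum])} if assumption else set()

    def minimize(envs):
        """Keep only minimal environments (drop any that is a superset of another)."""
        return {e for e in envs if not any(other < e for other in envs)}

    def propagate(antecedents, consequent):
        """Justification: union one environment from each antecedent into the consequent's label."""
        combos = {frozenset().union(*pick)
                  for pick in product(*(n.label for n in antecedents))}
        consequent.label = minimize(consequent.label | combos)

    def retract(assumption_datum, nodes):
        """Dependency-directed retraction: drop every environment relying on the assumption."""
        for n in nodes:
            n.label = {e for e in n.label if assumption_datum not in e}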

A Design of Effective Inference Methods and Their Application Guidelines for Supporting Various Medical Analytics Schemes

Moon Kwon Kim, Hyun Jung La, Soo Dong Kim

http://doi.org/

As a variety of personal medical devices become available, a large number of diverse medical contexts can be acquired from them. There have been efforts to analyze these medical contexts with software applications. In this paper, we propose a generic model of the medical analytics schemes used by medical experts, identify inference methods for realizing each scheme, and present guidelines for applying the inference methods to the schemes. In addition, we develop a proof-of-concept (PoC) inference system and analyze real medical contexts to diagnose relevant diseases, thereby validating the feasibility and effectiveness of the proposed medical analytics schemes and the guidelines for applying inference methods.

A Label Inference Algorithm Considering Vertex Importance in Semi-Supervised Learning

Byonghwa Oh, Jihoon Yang, Hyun-Jin Lee

http://doi.org/

Semi-supervised learning is an area of machine learning that employs both labeled and unlabeled data to train a model and has the potential to improve prediction performance compared to supervised learning. Graph-based semi-supervised learning has recently come into focus; it proceeds in two phases: graph construction, which converts the input data into a graph, and label inference, which predicts labels for the unlabeled data using the constructed graph. The inference relies on the smoothness assumption of semi-supervised learning. In this study, we propose an enhanced label inference algorithm that incorporates the importance of each vertex. In addition, we prove the convergence of the proposed algorithm and verify its superior performance.
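As an illustration of the kind of label inference the abstract describes (a generic sketch, not the authors' exact algorithm), the code below runs iterative label propagation on a weighted graph and scales each vertex's contribution by an importance score, which is assumed to be given (e.g., a degree- or centrality-based weight).

    import numpy as np

    def label_inference(W, Y, importance, labeled, alpha=0.99, iters=100):
        """Graph-based label inference with per-vertex importance weighting.

        W          : (n, n) symmetric affinity matrix of the constructed graph
        Y          : (n, c) one-hot label matrix; rows of unlabeled vertices are zero
        importance : (n,) positive importance score for each vertex (assumed given)
        labeled    : (n,) boolean mask of labeled vertices
        """
        # Importance-weighted affinities: influential vertices spread their labels more strongly.
        Wv = W * importance[np.newaxis, :]
        # Symmetric normalization, as in standard label propagation.
        d = np.maximum(Wv.sum(axis=1), 1e-12)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        S = D_inv_sqrt @ Wv @ D_inv_sqrt
        F = Y.astype(float).copy()
        for _ in range(iters):
            F = alpha * (S @ F) + (1 - alpha) * Y   # propagate, then pull back toward the seeds
            F[labeled] = Y[labeled]                 # clamp the labeled vertices
        return F.argmax(axis=1)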

Automatic Detection of Off-topic Documents using ConceptNet and Essay Prompt in Automated English Essay Scoring

Kong Joo Lee, Gyoung Ho Lee

http://doi.org/

This work presents a new method that can predict, without training data, whether an input essay is written on a given topic. ConceptNet is a common-sense knowledge base that is generated automatically from sentences extracted from a variety of document types. An essay prompt is the topic that an essay should be written about. The method proposed in this paper uses ConceptNet and an essay prompt to decide whether or not an input essay is off-topic. We introduce a way to find the shortest path between two nodes in ConceptNet, as well as a way to calculate the semantic similarity between two nodes. Both an essay prompt and a student's essay can be represented by concept nodes in ConceptNet. The semantic similarity between the concepts representing the essay prompt and those representing the student's essay is used to compute an "on-topicness" ranking; if the ranking is low, the essay is regarded as off-topic. We used eight different essay prompts and a student-essay collection for the performance evaluation, where the proposed method outperforms previous studies. Since ConceptNet enables simple text inference, the new method looks promising for designing essay prompts that require simple inference.
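A minimal sketch of the path-based similarity idea mentioned above, with ConceptNet represented as a plain adjacency list; the ConceptNet access layer, the concept-extraction step, and the inverse-path-length similarity used here are all assumptions, shown only to make the ranking computation concrete.

    from collections import deque

    def shortest_path_length(graph, src, dst):
        """BFS shortest path (in hops) between two concept nodes; None if unreachable."""
        if src == dst:
            return 0
        seen, queue = {src}, deque([(src, 0)])
        while queue:
            node, dist = queue.popleft()
            for nxt in graph.get(node, ()):
                if nxt == dst:
                    return dist + 1
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, dist + 1))
        return None

    def semantic_similarity(graph, a, b):
        """Inverse path-length similarity between two concepts (1.0 means identical)."""
        d = shortest_path_length(graph, a, b)
        return 0.0 if d is None else 1.0 / (1.0 + d)

    def on_topicness(graph, prompt_concepts, essay_concepts):
        """Average best-match similarity of essay concepts to prompt concepts;
        a low score suggests the essay is off-topic."""
        scores = [max(semantic_similarity(graph, e, p) for p in prompt_concepts)
                  for e in essay_concepts]
        return sum(scores) / len(scores) if scores else 0.0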

Scalable RDFS Reasoning Using the Graph Structure of In-Memory based Parallel Computing

MyungJoong Jeon, ChiSeoung So, Batselem Jagvaral, KangPil Kim, Jin Kim, JinYoung Hong, YoungTack Park

http://doi.org/

In recent years, there has been growing interest in RDFS inference for building rich knowledge bases. However, it is difficult to improve inference performance over large data on a single machine, so researchers are investigating RDFS inference engines for distributed computing environments. The existing engines, however, cannot process data in real time, are difficult to implement, and do not handle repetitive tasks well. To overcome these problems, we propose a method to construct an in-memory distributed inference engine that uses a parallel graph structure. Since an ontology based on triples inherently forms a graph, it is intuitive to design a graph-based inference engine. Moreover, the RDFS inference rules can be implemented with graph operators, so the engine can be designed according to the graph structure rather than the structure of a data table. In this study, we evaluate the proposed inference engine on the LUBM1000 and LUBM3000 datasets to measure inference speed. The results of our experiments indicate that the proposed in-memory distributed inference engine performs about 10 times faster than a storage-based inference engine.
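To illustrate what implementing an RDFS rule "according to the graph structure" can look like (a hypothetical sketch, not the authors' engine), the code below applies two standard RDFS entailment rules, rdfs11 (subclass transitivity) and rdfs9 (type propagation), as naive joins over an in-memory triple set until a fixed point is reached. In a distributed graph-based engine, these joins would instead be expressed as parallel operations over a partitioned graph.

    def rdfs_closure(triples):
        """Fixed-point application of two RDFS rules over an in-memory triple set.

        triples: set of (subject, predicate, object) tuples.
        rdfs11: (A subClassOf B), (B subClassOf C) => (A subClassOf C)
        rdfs9 : (X type A),       (A subClassOf B) => (X type B)
        """
        SUBCLASS, TYPE = "rdfs:subClassOf", "rdf:type"
        closure = set(triples)
        while True:
            sub = {(s, o) for s, p, o in closure if p == SUBCLASS}
            typ = {(s, o) for s, p, o in closure if p == TYPE}
            # Naive nested joins; a distributed engine would use keyed joins or message passing.
            new = {(a, SUBCLASS, c) for a, b in sub for b2, c in sub if b == b2}
            new |= {(x, TYPE, b) for x, a in typ for a2, b in sub if a == a2}
            if new <= closure:      # nothing newly inferred: fixed point reached
                return closure
            closure |= new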

Robust Particle Filter Based Route Inference for Intelligent Personal Assistants on Smartphones

Haejung Baek, Young Tack Park

http://doi.org/

Much research has been conducted on location-based intelligent personal assistants that understand a user's intention by learning the user's route model and then inferring the user's destinations and routes from GPS and other smartphone sensor data. The intelligence of such an assistant depends on how accurately and efficiently it can predict, in real time, the user's intended destinations and routes from uncertain sensor data. We propose a robust particle filter based on a dynamic Bayesian network model to infer the user's routes. The proposed filter includes a particle generator that compensates for incorrect and incomplete sensor information, an efficient switching function and a weight function that reduce the computational complexity, and a resampler that enhances the accuracy of the particles. The proposed method improves both the accuracy and the efficiency of determining a user's routes and destinations.
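Below is a minimal, generic bootstrap particle filter sketch that makes the predict/weight/resample cycle referred to above concrete; the motion and observation models are placeholders, and the paper's particle generator for incomplete sensor data and its switching function are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    def transition(particles, step_std=1.0):
        """Motion model (placeholder): random-walk step of 2-D positions along a route."""
        return particles + rng.normal(0.0, step_std, particles.shape)

    def likelihood(particles, observation, obs_std=5.0):
        """Observation model (placeholder): Gaussian likelihood of a noisy GPS fix."""
        d2 = ((particles - observation) ** 2).sum(axis=1)
        return np.exp(-0.5 * d2 / obs_std ** 2) + 1e-300   # avoid all-zero weights

    def resample(particles, weights):
        """Systematic resampling to concentrate particles in high-weight regions."""
        n = len(particles)
        positions = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        return particles[idx]

    def infer_route(observations, n_particles=500):
        """Run the predict -> weight -> resample cycle over a sequence of GPS fixes."""
        particles = rng.normal(observations[0], 10.0, (n_particles, 2))  # init near first fix
        estimates = []
        for z in observations:
            particles = transition(particles)                          # predict
            w = likelihood(particles, z)                               # weight
            w /= w.sum()
            estimates.append((particles * w[:, None]).sum(axis=0))     # posterior mean estimate
            particles = resample(particles, w)                         # resample
        return np.array(estimates)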

