SURE-based À-Trous Wavelet Filter for Interactive Monte Carlo Rendering

Soomin Kim, Bochang Moon, Sung-Eui Yoon

http://doi.org/

Monte Carlo ray tracing has been widely used for simulating a diverse set of photorealistic effects. However, this technique typically produces noise when an insufficient number of samples is used. As the number of samples allocated per pixel increases, the rendered images converge, but generating a sufficient number of samples requires prohibitive rendering time. To solve this problem, image filtering can be applied: instead of naively generating additional rays, the noisy image rendered with low sample counts is filtered to obtain a smoothed image. In this paper, we propose a Stein's Unbiased Risk Estimator (SURE)-based À-Trous wavelet filter that removes the noise in rendered images at a near-interactive rate. Based on SURE, we estimate the filtering errors associated with the À-Trous wavelet and identify the wavelet coefficients that reduce those errors. Our approach showed improvements of up to 6:1 over the original À-Trous filter in various regions of the image, while incurring only minor computational overhead. We have integrated our proposed filtering method with Embree, a recent interactive ray tracing system, and demonstrated its benefits.
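
The abstract does not spell out the estimator itself. For a linear filter F applied to a noisy pixel y with variance sigma^2, per-pixel SURE is commonly written as SURE_i = (F_i(y) - y_i)^2 + 2*sigma^2 * dF_i/dy_i - sigma^2. The sketch below is a simplified illustration under that assumption: it builds two à-trous (B3-spline) smoothing levels of a toy image and keeps, per pixel, the level with the lower estimated risk. It is not the authors' coefficient-wise selection scheme, and the image, kernel, and variance values are placeholders.

```python
# Simplified per-pixel SURE selection between a'-trous smoothing levels,
# assuming additive noise with known variance (illustrative only).
import numpy as np
from scipy.ndimage import convolve

def atrous_kernel(level):
    """2D B3-spline kernel with 2**(level-1)-1 zeros ("holes") between taps."""
    taps = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    step = 2 ** (level - 1)
    k1d = np.zeros(4 * step + 1)
    k1d[::step] = taps
    return np.outer(k1d, k1d)

def sure(filtered, noisy, variance, center_weight):
    # Per-pixel SURE: (F_i(y) - y_i)^2 + 2*sigma^2 * dF_i/dy_i - sigma^2.
    return (filtered - noisy) ** 2 + 2.0 * variance * center_weight - variance

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # toy "rendered" image
variance = 0.01
noisy = clean + rng.normal(0.0, np.sqrt(variance), clean.shape)

# Filtering an impulse image lets us read off dF_i/dy_i (the accumulated
# filter's centre weight) for each smoothing level.
impulse = np.zeros_like(noisy)
impulse[32, 32] = 1.0

smoothed, response, candidates = noisy, impulse, []
for level in (1, 2):
    k = atrous_kernel(level)
    smoothed = convolve(smoothed, k, mode='nearest')
    response = convolve(response, k, mode='constant')
    candidates.append((smoothed, sure(smoothed, noisy, variance, response[32, 32])))

# Per pixel, keep the smoothing level with the lowest estimated risk.
risks = np.stack([c[1] for c in candidates])
images = np.stack([c[0] for c in candidates])
denoised = np.take_along_axis(images, risks.argmin(0)[None], 0)[0]
print("mean estimated risk per level:", risks.mean(axis=(1, 2)))
```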

In-Memory File System Backed by Cloud Storage Services as Permanent Storages

Kyungjun Lee, Jiwon Kim, Sungtae Ryu, Hwansoo Han

http://doi.org/

As network technology advances, an increasing number of devices are connected through the Internet. Recently, cloud storage services have been gaining popularity, as they can be accessed conveniently anytime and anywhere. Among cloud storage services, object storage is representative owing to its low cost, high availability, and high durability. One limitation of object storage services is that data in the cloud can be accessed only through HTTP-based RESTful APIs. In our work, we resolve this limitation with an in-memory file system that provides a POSIX interface to file system users and communicates with cloud object storage through RESTful APIs. In particular, our flush mechanism is compatible with existing file systems, as it is based on the swap mechanism of the Linux kernel. Our in-memory file system backed by cloud storage reduces performance overhead and outperforms S3QL by 57% in write operations. It also shows performance comparable to tmpfs in read operations.
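
As a rough user-space illustration of the flush idea (the paper's mechanism sits inside the Linux kernel's swap path and is not reproduced here), the sketch below keeps hot file blocks in memory and, when the cache overflows, flushes the coldest block to an object store with a RESTful PUT, fetching it back with a GET on a read miss. The endpoint, bucket name, and cache policy are assumptions for illustration only.

```python
# User-space illustration: hot file blocks stay in memory; evicted blocks are
# PUT to an object store over HTTP and fetched back with GET on demand.
from collections import OrderedDict
import requests

ENDPOINT = "http://localhost:9000/my-bucket"   # hypothetical S3-compatible endpoint

class CloudBackedCache:
    def __init__(self, capacity_blocks=1024):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()            # block_id -> bytes, in LRU order

    def write(self, block_id, data):
        self.blocks[block_id] = data
        self.blocks.move_to_end(block_id)
        if len(self.blocks) > self.capacity:
            victim, payload = self.blocks.popitem(last=False)
            # RESTful "flush": persist the coldest block as an object.
            requests.put(f"{ENDPOINT}/{victim}", data=payload).raise_for_status()

    def read(self, block_id):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)
            return self.blocks[block_id]
        # Miss: fetch the object back from cloud storage.
        resp = requests.get(f"{ENDPOINT}/{block_id}")
        resp.raise_for_status()
        self.write(block_id, resp.content)
        return resp.content
```

A real system would expose this through a POSIX layer rather than a Python class, but the PUT/GET round trip is the essential interaction with object storage.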

Automatic Recognition Algorithm of Unknown Ships on Radar

Hyun Chul Jung, Soung Woong Yoon, Sang Hoon Lee

http://doi.org/

Seeking and recognizing maritime targets are very important tasks for maritime safety. While searching for maritime targets using radar is possible, recognition is conducted without automatic identification system, radio communicator or visibility. If this recognition is not feasible, radar operator must tediously recognize maritime targets using movement features on radar base on know-how and experience. In this paper, to support the radar operator’s mission of continuous observation, we propose an algorithm for automatic recognition of an unknown ship using movement features on radar and a method of detecting potential ship related accidents. We extract features from contact range, course and speed of four types of vessels and evaluate the recognition accuracy using SVM and suggest a method of detecting potential ship related accidents through the algorithm. Experimentally, the resulting recognition accuracy is found to be more than 90% and presents the possibility of detecting potential ship related accidents through the algorithm using information of MV Sewol. This method is an effective way to support operator’s know-how and experience in various circumstances and assist in detecting potential ship related accidents.
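
A minimal sketch of the classification step, assuming scikit-learn: an SVM is trained on simple movement features (contact range, course, speed) for four vessel classes. All feature values and class profiles below are synthetic stand-ins, not the paper's radar data.

```python
# Hedged sketch: classify vessel type from movement features with an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class = 200
# Four illustrative vessel types with different typical (speed kn, range km) profiles.
profiles = [(5.0, 2.0), (12.0, 20.0), (20.0, 40.0), (30.0, 10.0)]
X, y = [], []
for label, (speed, range_km) in enumerate(profiles):
    feats = np.column_stack([
        rng.normal(range_km, 5.0, n_per_class),      # contact range (km)
        rng.uniform(0.0, 360.0, n_per_class),        # course (deg)
        rng.normal(speed, 2.0, n_per_class),         # speed (knots)
    ])
    X.append(feats)
    y.append(np.full(n_per_class, label))
X, y = np.vstack(X), np.concatenate(y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```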

Successful Win-Win Requirements Negotiation Method using Game Theoretic Approach

Kwan Hong Lee, Seok-Won Lee

http://doi.org/

As the structure of the software industry changes, the emerging concept of Software Ecosystems (SECO) presents various challenges that software engineers must overcome. In market-driven software product development, engineers must be able to offer products of high value to both their own business and their customers in order to remain competitive. Each stakeholder's perspectives and interests should be reconciled in terms of requirements so that engineers can offer high-value products through requirements selection. Existing works have merely mentioned the need for requirements negotiation between stakeholders without proposing detailed guidelines or practices. In this work, a systematic requirements negotiation process is proposed to resolve conflicts of interest among stakeholders in SECO. The interests of stakeholders are analyzed based on goal-based requirements engineering, and the rationale behind requirements conflicts is structured for management. The stepwise requirements negotiation process aims at resolving requirements conflicts by applying game theory concepts based on the self-interested behaviors of stakeholders.
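
Purely to illustrate the game-theoretic framing (the payoff numbers below are invented and the paper's stepwise process is not reproduced): two stakeholders each choose a requirement bundle, their payoffs reflect how well the choice serves their own goals, and a pure Nash equilibrium marks a pair of choices from which neither self-interested stakeholder would deviate.

```python
# Toy illustration: two stakeholders, two requirement-bundle choices each,
# and a search for pure Nash equilibria. Payoff numbers are invented.
import numpy as np

# Rows: vendor's bundle choice, columns: customer's bundle choice.
vendor_payoff   = np.array([[3, 1], [4, 2]])
customer_payoff = np.array([[2, 4], [1, 3]])

def pure_nash_equilibria(p1, p2):
    eqs = []
    for i in range(p1.shape[0]):
        for j in range(p1.shape[1]):
            # Neither player can gain by unilaterally switching.
            if p1[i, j] == p1[:, j].max() and p2[i, j] == p2[i, :].max():
                eqs.append((i, j))
    return eqs

print("pure Nash equilibria (vendor bundle, customer bundle):",
      pure_nash_equilibria(vendor_payoff, customer_payoff))
```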

Automatic Segmentation of Femoral Cartilage in Knee MR Images using Multi-atlas-based Locally-weighted Voting

Hyeun A Kim, Hyeonjin Kim, Han Sang Lee, Helen Hong

http://doi.org/

In this paper, we propose an automatic segmentation method for femoral cartilage in knee MR images using multi-atlas-based locally-weighted voting. The proposed method involves two steps. First, to exploit the shape information that the femoral cartilage is attached to the femur, the femur is segmented via volume- and object-based locally-weighted voting and narrow-band region growing. Second, the object-based affine transformation of the femur is applied to the registration of the femoral cartilage, and the femoral cartilage is segmented via multi-atlas shape-based locally-weighted voting. To evaluate the performance of the proposed method, we compared the segmentation results of the majority voting method, the intensity-based locally-weighted voting method, and the proposed method with manual segmentation results defined by an expert. In our experiments, the proposed method avoids leakage into neighboring regions with intensities similar to that of femoral cartilage and shows improved segmentation accuracy.
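
The sketch below shows the generic locally-weighted label fusion step on synthetic volumes, assuming intensity-based weights (a Gaussian of the local mean squared difference between each registered atlas and the target); the paper's shape-based weighting and registration pipeline are not reproduced.

```python
# Generic locally-weighted label fusion: each registered atlas votes per voxel,
# weighted by how well its intensities match the target in a small neighbourhood.
import numpy as np
from scipy.ndimage import uniform_filter

def locally_weighted_voting(target, atlas_images, atlas_labels, patch=5, sigma=0.1):
    votes = np.zeros_like(target, dtype=float)
    weight_sum = np.zeros_like(target, dtype=float)
    for img, lab in zip(atlas_images, atlas_labels):
        # Local mean squared difference within a patch around every voxel.
        local_mse = uniform_filter((target - img) ** 2, size=patch)
        w = np.exp(-local_mse / (2.0 * sigma ** 2))
        votes += w * lab
        weight_sum += w
    return (votes / np.maximum(weight_sum, 1e-12)) > 0.5   # fused binary mask

rng = np.random.default_rng(0)
target = rng.random((32, 32, 32))                          # toy target volume
atlases = [target + rng.normal(0, 0.05, target.shape) for _ in range(3)]
labels = [(a > 0.5).astype(float) for a in atlases]        # toy atlas label maps
print("segmented voxels:", locally_weighted_voting(target, atlases, labels).sum())
```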

Image Caption Generation using Recurrent Neural Network

Changki Lee

http://doi.org/

Automatic generation of captions for an image is a very difficult task because it requires both computer vision and natural language processing technologies. However, this task has many important applications, such as early childhood education, image retrieval, and navigation for the blind. In this paper, we describe a Recurrent Neural Network (RNN) model for generating image captions, which takes as input image features extracted by a Convolutional Neural Network (CNN). We demonstrate that our models produce state-of-the-art results in image caption generation experiments on the Flickr 8K, Flickr 30K, and MS COCO datasets.
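
A minimal sketch of a CNN-conditioned RNN decoder, assuming PyTorch: the CNN feature vector is projected and fed to an LSTM as the first time step, after which the LSTM predicts the caption word by word. The dimensions, vocabulary size, and choice of an LSTM cell are placeholders and may differ from the paper's exact architecture.

```python
# Minimal CNN-feature-conditioned LSTM caption decoder (PyTorch sketch).
import torch
import torch.nn as nn

class CaptionDecoder(nn.Module):
    def __init__(self, feat_dim=2048, embed_dim=256, hidden_dim=512, vocab_size=10000):
        super().__init__()
        self.img_proj = nn.Linear(feat_dim, embed_dim)   # CNN feature -> pseudo "word 0"
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, cnn_features, captions):
        # Prepend the projected image feature so the LSTM sees the image
        # before it starts predicting words.
        img_token = self.img_proj(cnn_features).unsqueeze(1)    # (B, 1, E)
        words = self.embed(captions[:, :-1])                    # (B, T-1, E)
        inputs = torch.cat([img_token, words], dim=1)           # (B, T, E)
        hidden, _ = self.lstm(inputs)
        return self.out(hidden)                                 # (B, T, V) logits

decoder = CaptionDecoder()
feats = torch.randn(4, 2048)                  # e.g. pooled CNN activations
caps = torch.randint(0, 10000, (4, 12))       # toy token ids
logits = decoder(feats, caps)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 10000), caps.reshape(-1))
print(logits.shape, float(loss))
```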

Test Case Grouping and Filtering for Better Performance of Spectrum-based Fault Localization

Jeongho Kim, Eunseok Lee

http://doi.org/

Spectrum-based fault localization (SFL) assigns a suspiciousness score to each program statement. A statement's score is influenced more strongly by failed test cases than by passed ones: a failed test case raises the suspiciousness of the statements it covers, while a passed test case reduces part of the assigned suspiciousness. In the absence of a failed test case, it is impossible to localize the fault; thus, failed test cases are very important for fault localization. However, SFL has difficulty reflecting the unique characteristics of a failed test because failed and passed test cases are fed in together when the suspiciousness is calculated. This paper addresses this limitation and suggests a test case grouping method for more accurate fault localization. In addition, this paper suggests a filtering method that considers test efficiency and verifies its effectiveness by applying it to 65 algorithms. In 90% of the methods, accuracy improved by 13% and effectiveness improved by 72% in terms of the EXAM score.
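
The abstract does not state the suspiciousness formula; a widely used one is Ochiai, susp(s) = failed(s) / sqrt(total_failed * (failed(s) + passed(s))). The sketch below computes it from toy coverage spectra; the paper's grouping and filtering of test cases would happen before this scoring step.

```python
# Baseline spectrum-based suspiciousness (Ochiai) from per-test coverage.
# Coverage sets and verdicts below are toy values.
import math

# coverage[test] = set of executed statement ids; passed[test] = True if the test passed
coverage = {
    "t1": {1, 2, 3},    "t2": {1, 3, 4},
    "t3": {2, 3, 4, 5}, "t4": {1, 2, 5},
}
passed = {"t1": True, "t2": True, "t3": False, "t4": True}

total_failed = sum(not p for p in passed.values())
statements = set().union(*coverage.values())

def ochiai(stmt):
    ef = sum(stmt in coverage[t] and not passed[t] for t in coverage)  # failed & covers
    ep = sum(stmt in coverage[t] and passed[t] for t in coverage)      # passed & covers
    denom = math.sqrt(total_failed * (ef + ep))
    return ef / denom if denom else 0.0

for s in sorted(statements, key=ochiai, reverse=True):
    print(f"statement {s}: suspiciousness {ochiai(s):.3f}")
```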

Dynamic Discovery of Geographically Cohesive Services in Internet of Things Environments

KyeongDeok Baek, MinHyeop Kim, InYoung Ko

http://doi.org/

In Internet of Things (IoT) environments, users need to discover the IoT devices required to access the services that accomplish their tasks. As IoT technologies advance, a user task will utilize various types of IoT-based services deployed in an IoT environment. Therefore, to accomplish a user task effectively, the services that utilize IoT devices need to be found within a certain geographical region. In addition, service discovery needs to be performed in a stable manner that accounts for dynamically changing IoT environments. To deal with these issues, we propose two service discovery methods that consider the geographic cohesiveness of services in IoT environments. We compare the effectiveness of the proposed methods against a traditional service discovery algorithm that does not consider geographic cohesiveness.
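
As a toy illustration of geographic cohesiveness (not the paper's two discovery methods), the sketch below picks one candidate device per required service type so that the chosen devices lie as close together as possible, measured by the maximum pairwise distance of the combination. The service types and coordinates are hypothetical.

```python
# Brute-force selection of a geographically cohesive service combination:
# one instance per required type, minimising the combination's diameter.
from itertools import product, combinations
import math

# Candidate device locations (x, y in metres) per required service type.
candidates = {
    "display": [(0, 0), (40, 35)],
    "speaker": [(5, 3), (42, 30)],
    "light":   [(50, 50), (38, 33)],
}

def max_pairwise_distance(points):
    return max(math.dist(a, b) for a, b in combinations(points, 2))

best = min(product(*candidates.values()), key=max_pairwise_distance)
print("most cohesive combination:", dict(zip(candidates, best)))
print("diameter (m):", round(max_pairwise_distance(best), 1))
```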

Inverse Document Frequency-Based Word Embedding of Unseen Words for Question Answering Systems

Wooin Lee, Gwangho Song, Kyuseok Shim

http://doi.org/

A question answering (QA) system finds an actual answer to a question posed by a user, whereas a typical search engine only finds links to the relevant documents. Recent work on open-domain QA systems has received much attention in the fields of natural language processing, artificial intelligence, and data mining. However, prior QA systems simply replace all words that are not in the training data with a single token, even though such unseen words are likely to play crucial roles in differentiating the candidate answers from the actual answers. In this paper, we propose a method to compute vectors for such unseen words by taking into account the contexts in which the words occur. We also propose a model that utilizes inverse document frequencies (IDF) to efficiently process unseen words by expanding the system's vocabulary. Finally, we validate through experiments that the proposed method and model improve the performance of a QA system.
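
One plausible reading of the idea, sketched below: the vector of an out-of-vocabulary word is computed as the IDF-weighted average of the embeddings of its context words, so rare (informative) neighbours dominate. The corpus, embeddings, and context are toy placeholders, and the paper's actual model may combine IDF with the QA model differently.

```python
# IDF-weighted context averaging as a stand-in embedding for an unseen word.
import math
import numpy as np

corpus = [
    ["the", "court", "ruled", "on", "the", "appeal"],
    ["the", "player", "scored", "in", "the", "match"],
    ["the", "appeal", "was", "dismissed"],
]
vocab = sorted({w for doc in corpus for w in doc})
idf = {w: math.log(len(corpus) / sum(w in doc for doc in corpus)) for w in vocab}

rng = np.random.default_rng(0)
embedding = {w: rng.normal(size=50) for w in vocab}     # stand-in trained vectors

def embed_unseen(context_words):
    weights, vectors = [], []
    for w in context_words:
        if w in embedding:
            weights.append(idf.get(w, 0.0))
            vectors.append(embedding[w])
    if not vectors or sum(weights) == 0.0:
        return np.zeros(50)
    return np.average(vectors, axis=0, weights=weights)

# Context around an unseen word, e.g. "...the court dismissed the <UNSEEN> appeal..."
print(embed_unseen(["the", "court", "dismissed", "the", "appeal"])[:5])
```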

TeT: Distributed Tera-Scale Tensor Generator

ByungSoo Jeon, JungWoo Lee, U Kang

http://doi.org/

A tensor is a multi-dimensional array that can represent many kinds of data, such as (user, user, time) triples in a social network system. A tensor generator is an important tool for multi-dimensional data mining research, with various applications including simulation, multi-dimensional data modeling/understanding, and sampling/extrapolation. However, existing tensor generators cannot generate sparse tensors that, like real-world tensors, obey a power law. In addition, they have limitations such as restrictions on the tensor sizes they can process and the additional time required to upload a generated tensor to a distributed system for further analysis. In this study, we propose TeT, a distributed tera-scale tensor generator that solves these problems. TeT generates sparse random tensors as well as sparse R-MAT and Kronecker tensors without any limitation on tensor size. In addition, a TeT-generated tensor is immediately ready for further tensor analysis on the same distributed system. The careful design of TeT facilitates nearly linear scalability in the number of machines.
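
A single-machine sketch of one of the generators named in the abstract, an R-MAT-style process extended to three modes: each nonzero's coordinates are drawn by repeatedly choosing one octant of a 2x2x2 probability box, which concentrates mass in a corner and yields a skewed, power-law-like sparse tensor. The sizes and probabilities are illustrative; the distributed, tera-scale machinery of TeT is not reproduced.

```python
# Single-machine R-MAT-style sparse 3-mode tensor generation (toy scale).
import numpy as np

def rmat_3d_nonzero(log2_size, probs, rng):
    """Draw one (i, j, k) index for a tensor of side 2**log2_size."""
    i = j = k = 0
    for _ in range(log2_size):
        octant = rng.choice(8, p=probs)          # pick one cell of the 2x2x2 box
        i = (i << 1) | (octant & 1)
        j = (j << 1) | ((octant >> 1) & 1)
        k = (k << 1) | ((octant >> 2) & 1)
    return i, j, k

rng = np.random.default_rng(0)
# Skewed octant probabilities (sum to 1) concentrate mass in one corner.
probs = np.array([0.35, 0.12, 0.12, 0.06, 0.12, 0.08, 0.08, 0.07])
nonzeros = {rmat_3d_nonzero(10, probs, rng) for _ in range(5000)}
print(f"generated {len(nonzeros)} distinct nonzeros in a 1024^3 tensor")
```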

Modified RTT Estimation Scheme for Improving Throughput of Delay-based TCP in Wireless Networks

Hyunsoo Kang, Jiwoo Park, Kwangsue Chung

http://doi.org/

In wireless networks, TCP suffers performance degradation because it misinterprets packet losses caused by the characteristics of the wireless link as congestion, and because throughput oscillates as the set of devices sharing the limited bandwidth changes. Delay-based TCP is not affected by packet loss because it controls the window size using the RTT; therefore, it avoids the unnecessary rate reduction caused by misinterpreting the reason for packet loss. In this paper, we propose an algorithm that addresses the remaining problems of delay-based TCP. The proposed scheme adapts throughput by adding to BaseRTT an RTT term that quickly reflects the network conditions; it adjusts the weight of the RTT and increases or decreases the window size according to the remaining buffer space. Simulations indicate that the proposed scheme alleviates the throughput oscillation problem compared to legacy TCP Vegas.
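
For context, the sketch below shows the delay-based (TCP Vegas-style) baseline that the proposed scheme modifies: the sender compares the expected throughput cwnd/BaseRTT with the actual throughput cwnd/RTT and nudges the congestion window accordingly. The modified BaseRTT update and buffer-based weighting described in the abstract are not reproduced; all numbers are toy values.

```python
# Toy TCP-Vegas-style window control driven by RTT measurements.
def vegas_step(cwnd, base_rtt, rtt, alpha=2.0, beta=4.0):
    expected = cwnd / base_rtt            # rate if no queueing delay
    actual = cwnd / rtt                   # rate the measured RTT actually allows
    diff = (expected - actual) * base_rtt # estimated packets queued in the network
    if diff < alpha:
        return cwnd + 1                   # under-using the path: grow linearly
    if diff > beta:
        return cwnd - 1                   # queue building up: back off
    return cwnd                           # in the target band: hold

cwnd, base_rtt = 10.0, 0.050
for rtt in [0.050, 0.052, 0.060, 0.080, 0.080, 0.055]:   # measured RTT samples (s)
    cwnd = vegas_step(cwnd, base_rtt, rtt)
    print(f"rtt={rtt*1000:.0f} ms -> cwnd={cwnd:.0f}")
```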

Answer Snippet Retrieval for Question Answering of Medical Documents

Hyeon-gu Lee, Minkyoung Kim, Harksoo Kim

http://doi.org/

With the explosive increase in the number of online medical documents, the demand for question-answering systems is growing. Recently, question-answering models based on machine learning have shown high performance in various domains. However, many question-answering models in the medical domain are still based on information retrieval techniques because of the sparseness of training data. Based on various information retrieval techniques, we propose an answer snippet retrieval model for question-answering systems over medical documents. The proposed model first searches for candidate answer sentences in medical documents using a cluster-based retrieval technique. It then generates reliable answer snippets by re-ranking the candidate answer sentences based on various sentence retrieval techniques. In experiments with BioASQ 4b, the proposed model showed better performance (MAP of 0.0604) than previous models.
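
A rough sketch of the re-ranking idea, assuming scikit-learn: each candidate answer sentence receives a few simple sentence-retrieval scores (TF-IDF cosine similarity and query-term overlap) that are combined with fixed weights. The weights, sentences, and features are placeholders; the paper's cluster-based first stage and its actual re-ranking model are not reproduced.

```python
# Toy re-ranking of candidate answer sentences by combined retrieval scores.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

question = "what drug treats high blood pressure"
candidates = [
    "Lisinopril is commonly prescribed to treat high blood pressure.",
    "Blood pressure is measured in millimetres of mercury.",
    "High doses of caffeine can raise blood pressure temporarily.",
]

vectorizer = TfidfVectorizer().fit([question] + candidates)
q_vec = vectorizer.transform([question])
c_vecs = vectorizer.transform(candidates)
tfidf_scores = cosine_similarity(q_vec, c_vecs)[0]

def term_overlap(q, s):
    q_terms, s_terms = set(q.lower().split()), set(s.lower().split())
    return len(q_terms & s_terms) / len(q_terms)

combined = [0.7 * t + 0.3 * term_overlap(question, s)
            for t, s in zip(tfidf_scores, candidates)]
for score, sent in sorted(zip(combined, candidates), reverse=True):
    print(f"{score:.3f}  {sent}")
```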

Constant-Size Ciphertext-Policy Attribute-Based Data Access and Outsourceable Decryption Scheme

Changhee Hahn, Junbeom Hur

http://doi.org/

Sharing data among multiple users on public storage, e.g., the cloud, is considered efficient because the cloud provides on-demand computing services anytime and anywhere. Secure data sharing is achieved through fine-grained access control. Existing symmetric and public key encryption schemes are not suitable for secure data sharing because they support only a one-to-one relationship between a ciphertext and a secret key. Attribute-based encryption supports fine-grained access control; however, the ciphertext size grows linearly with the number of attributes. Additionally, the decryption process has a high computational cost, which makes it inapplicable in resource-constrained environments. In this study, we propose an efficient attribute-based secure data sharing scheme with outsourceable decryption. The proposed scheme guarantees constant-size ciphertexts irrespective of the number of attributes. For static attributes, the computational cost to the user is reduced by delegating approximately 95.3% of the decryption operations to the more powerful storage system, whereas 72.3% of the decryption operations are outsourced for dynamic attributes.

