Search: author 김대호 (Daeho Kim) — 2 results

A Script Generation Method for Microservice Deployment in a Container Orchestration Environment

Daeho Kim, Donggyu Yun, Joonseok Park, Keunhyuk Yeom

http://doi.org/10.5626/JOK.2019.46.7.682

Container orchestration technology is used to develop applications such as microservices and to support the deployment and management of container environments. Orchestration technology is well suited to the resilient management of large-scale microservice applications because it automates the creation, deployment, and management of hundreds of containers in batches. However, when an existing monolithic application is converted into containers at the microservice level, the components required for deployment and management must be mapped and defined manually. In this paper, we propose a method that automatically generates a template script for deploying and managing microservices in a container orchestration environment, based on the UML design artifacts of the existing monolithic application. In a case study, a template script was generated with the proposed method for Kubernetes, a container orchestration environment, and the microservice was deployed and managed by executing the script.
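The abstract does not specify the script format, but a Kubernetes template script of this kind is typically a Deployment manifest. The following is a minimal, hypothetical sketch of the generation step: the function name, its parameters, and the manifest shape are assumptions; in the proposed method the values (service name, image, replica count) would be extracted from the UML design artifacts rather than passed by hand.

```python
def generate_deployment_script(service_name, image, replicas=2, port=8080):
    """Hypothetical sketch: emit a minimal Kubernetes Deployment manifest
    for one microservice. In the paper's method these values would come
    from UML design artifacts; here they are plain arguments."""
    return f"""apiVersion: apps/v1
kind: Deployment
metadata:
  name: {service_name}
spec:
  replicas: {replicas}
  selector:
    matchLabels:
      app: {service_name}
  template:
    metadata:
      labels:
        app: {service_name}
    spec:
      containers:
      - name: {service_name}
        image: {image}
        ports:
        - containerPort: {port}
"""

# Usage: the generated text could then be applied with `kubectl apply -f -`.
manifest = generate_deployment_script(
    "order-service", "registry.example.com/order-service:1.0")
print(manifest)
```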

Elastic Multiple Parametric Exponential Linear Units for Convolutional Neural Networks

Daeho Kim, Jaeil Kim

http://doi.org/10.5626/JOK.2019.46.5.469

The activation function plays a major role in determining the depth and non-linearity of neural networks. Since the introduction of Rectified Linear Units (ReLU) for deep neural networks, many variants have been proposed. For example, Exponential Linear Units (ELU) lead to faster learning by pushing the mean of the activations closer to zero, and Elastic Rectified Linear Units (EReLU) change the slope randomly for better model generalization. In this paper, we propose Elastic Multiple Parametric Exponential Linear Units (EMPELU) as a generalized form of ELU and EReLU. During training, EMPELU randomly varies the slope of the positive part of the function argument within a moderate range, while the negative part can take the form of various activation functions through its parameter learning. EMPELU improved the accuracy and generalization performance of convolutional neural networks on object classification tasks (CIFAR-10/100) beyond well-known activation functions.
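The abstract describes EMPELU only in words; a forward pass consistent with that description might look like the sketch below. The exact functional form, parameter names (`alpha`, `beta`), and elasticity range `eps` are assumptions, not the paper's definitions: the positive part uses a random slope during training as in EReLU, and the negative part uses a parametric exponential as in ELU-style units, where `alpha` and `beta` would be learned in a real implementation.

```python
import numpy as np

def empelu(x, alpha=1.0, beta=1.0, eps=0.2, training=True, rng=None):
    """Assumed sketch of an EMPELU forward pass.

    Positive part: slope drawn uniformly from [1 - eps, 1 + eps] during
    training (elastic behaviour); identity slope at test time.
    Negative part: alpha * (exp(beta * x) - 1), with alpha and beta
    treated as learnable parameters in an actual network.
    """
    rng = rng or np.random.default_rng()
    k = rng.uniform(1.0 - eps, 1.0 + eps) if training else 1.0
    return np.where(x > 0, k * x, alpha * np.expm1(beta * x))
```

At test time this reduces to a standard parametric ELU: positive inputs pass through unchanged, and large negative inputs saturate near `-alpha`.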



Journal of KIISE

  • ISSN : 2383-630X(Print)
  • ISSN : 2383-6296(Electronic)
  • KCI Accredited Journal

Editorial Office

  • Tel. +82-2-588-9240
  • Fax. +82-2-521-1352
  • E-mail. chwoo@kiise.or.kr