TY  - JOUR
T1  - Elastic Multiple Parametric Exponential Linear Units for Convolutional Neural Networks
AU  - Kim, Daeho
AU  - Kim, Jaeil
JO  - Journal of KIISE, JOK
PY  - 2019
DA  - 2019/1/14
DO  - 10.5626/JOK.2019.46.5.469
KW  - activation function
KW  - convolutional neural network
KW  - image classification
KW  - deep learning
AB  - The activation function plays a major role in determining the depth and non-linearity of neural networks. Since the introduction of Rectified Linear Units (ReLU) for deep neural networks, many variants have been proposed. For example, Exponential Linear Units (ELU) lead to faster learning by pushing the mean of the activations closer to zero, and Elastic Rectified Linear Units (EReLU) randomly change the slope for better model generalization. In this paper, we propose Elastic Multiple Parametric Exponential Linear Units (EMPELU) as a generalized form of ELU and EReLU. EMPELU randomly varies the slope of the positive part of the function within a moderate range during training, while its negative part can represent various types of activation functions through parameter learning. EMPELU improved the accuracy and generalization performance of convolutional neural networks on object classification tasks (CIFAR-10/100) beyond well-known activation functions.
ER  - 
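The abstract describes EMPELU only at a high level, so the following PyTorch sketch is one plausible reading of that description, not the paper's implementation. The class name EMPELU, the per-channel learnable parameters alpha and beta, their initial values, the sampling range sigma, and the per-element sampling granularity of the random slope are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class EMPELU(nn.Module):
    """Hypothetical sketch of Elastic Multiple Parametric ELU.

    Positive inputs: scaled by a random factor k ~ U(1 - sigma, 1 + sigma)
    during training (k = 1 at inference), following the EReLU idea.
    Negative inputs: alpha * (exp(beta * x) - 1) with learnable per-channel
    alpha and beta, generalizing the ELU shape through parameter learning.
    """
    def __init__(self, num_channels: int, sigma: float = 0.5):
        super().__init__()
        self.sigma = sigma
        # Initial values of the learnable parameters are assumptions,
        # not taken from the paper.
        self.alpha = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.ones(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Elastic slope on the positive part; sampling one factor per
            # activation element is an assumption (the paper may sample
            # per layer or per mini-batch instead).
            k = torch.empty_like(x).uniform_(1 - self.sigma, 1 + self.sigma)
        else:
            k = 1.0
        # Broadcast per-channel parameters over (N, C, H, W) inputs.
        alpha = self.alpha.view(1, -1, 1, 1)
        beta = self.beta.view(1, -1, 1, 1)
        pos = k * torch.clamp(x, min=0.0)
        neg = alpha * (torch.exp(beta * torch.clamp(x, max=0.0)) - 1.0)
        return pos + neg
```

Under these assumptions it would act as a drop-in nonlinearity, e.g. nn.Sequential(nn.Conv2d(3, 64, 3), EMPELU(64)); calling module.eval() fixes the positive slope to 1 at test time.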