TY - JOUR
T1 - Variational Recurrent Neural Networks with Relational Memory Core Architectures
AU - Kim, Geon-Hyeong
AU - Seo, Seokin
AU - Kim, Shinhyung
AU - Kim, Kee-Eung
JO - Journal of KIISE, JOK
PY - 2020
DA - 2020/1/14
DO - 10.5626/JOK.2020.47.2.189
KW - recurrent neural networks
KW - relational memory core
KW - variational inference
AB - Recurrent neural networks (RNNs) are designed to model sequential data and to learn generative models of such data. VRNNs (variational recurrent neural networks), which incorporate elements of the VAE (variational autoencoder) into the RNN, can represent complex data distributions. Meanwhile, the RMC (relational memory core) captures relationships among the inputs of a sequence by introducing a self-attention-based memory architecture into the RNN memory cell. In this paper, we propose the VRMC (variational relational memory core) model, which introduces the relational memory core architecture into the VRNN. By evaluating generated music data, we show that VRMC outperforms models from previous studies and is more effective for modeling sequential data.
ER - 