TY - JOUR
T1 - Multi-Level Attention-Based Generation Model for Long-Term Conversation
AU - Kim, Hongjin
AU - Keum, Bitna
AU - Huang, Jinxia
AU - Kwon, Ohwoog
AU - Kim, Harksoo
JO - Journal of KIISE, JOK
PY - 2025
DA - 2025/1/14
DO - 10.5626/JOK.2025.52.2.117
KW - dialogue system
KW - long-term conversation
KW - generation model
KW - open-domain dialogue
AB - Research into developing more human-like conversational models that utilize persona memory to generate responses is actively underway. Many existing studies employ a separate retrieval model to identify relevant personas from memory, which can slow down the overall system and make it cumbersome. Moreover, these studies have focused primarily on the ability to generate responses that reflect a persona well; however, the ability to determine whether referencing a persona is necessary at all should precede this. Therefore, in this paper, we propose a model that does not use a retriever. Instead, the need to reference memory is determined through multi-level attention operations within the generation model itself. If a reference is deemed necessary, the response reflects the relevant persona; otherwise, the response focuses on the conversational context. Experimental results confirm that the proposed model operates effectively in long-term conversations.
ER - 