Digital Library [Search Result]
Quantitative Analysis of Sequence-based Container Security Enhancement using a System Call Sequence Extraction Framework
Somin Song, Youyang Kim, Byungchul Tak
http://doi.org/10.5626/JOK.2023.50.11.913
Container escape is one of the most critical threats to containerized applications that share a host kernel. Attackers exploit kernel vulnerabilities through a series of manipulated system calls to achieve privilege escalation, which can lead to container escape. Seccomp, a security mechanism widely used in containers, strengthens isolation by filtering out unnecessary system call invocations. However, Seccomp's filtering mechanism, which blocks individual system calls, has a fundamental limitation: it remains vulnerable to attacks that use only system calls allowed by the policy. This study therefore presents a hybrid analysis framework that combines static and dynamic analysis to extract system call sequences from exploit codes. Using this framework, we compared the security strength of the existing individual-system-call-based filtering mechanism with the proposed system-call-sequence-based filtering mechanism, measured by the number of exploit codes each can block given system call profiles for the same exploit codes. As a result, the proposed sequence-based filtering mechanism increased defense coverage from 63% to 98% relative to the existing individual-system-call-based mechanism.
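The distinction the abstract draws can be illustrated with a minimal sketch: an individual-syscall filter (Seccomp-style allowlist) passes any trace whose calls are all permitted, while a sequence-based filter additionally rejects known malicious orderings of otherwise-allowed calls. The syscall names, the allowlist, and the blocked sequence below are illustrative assumptions, not profiles from the paper.

```python
# Illustrative allowlist of individually-permitted system calls (assumed).
ALLOWED_SYSCALLS = {"read", "write", "mmap", "openat", "close"}

# A hypothetical "dangerous" ordering of individually-allowed syscalls,
# such as a sequence extracted from an exploit's system call trace.
BLOCKED_SEQUENCES = [("mmap", "write", "close")]

def individual_filter(trace):
    """Seccomp-style filtering: permit a trace iff every syscall is allowed."""
    return all(s in ALLOWED_SYSCALLS for s in trace)

def sequence_filter(trace):
    """Sequence-based filtering: also reject traces containing a blocked n-gram."""
    if not individual_filter(trace):
        return False
    for seq in BLOCKED_SEQUENCES:
        n = len(seq)
        if any(tuple(trace[i:i + n]) == seq for i in range(len(trace) - n + 1)):
            return False
    return True

exploit_trace = ["openat", "mmap", "write", "close"]
print(individual_filter(exploit_trace))  # True: each syscall is individually allowed
print(sequence_filter(exploit_trace))    # False: the trace contains a blocked sequence
```

The exploit trace slips past the per-syscall filter but is caught by the sequence filter, which is the gap in coverage (63% vs. 98%) the study quantifies.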
Using Vertical and Horizontal Hidden Vector of BERT, Attention-based Separated Transfer Learning Model for Dialog Response Selection
http://doi.org/10.5626/JOK.2021.48.1.119
The purpose of this paper is to build a dialog response selection system that accurately identifies the next utterance of a given dialog (one correct answer out of 100 candidates), based on data provided by DSTC. To this end, BERT was used; BERT is versatile and achieves high performance, but the model is not easy to customize, and it is also difficult to transform the input data format for performance optimization. To address these issues, we propose an effective data augmentation method together with an independent transfer learning model that extracts contextual attention information (self-attention vectors) from the BERT model. This made it possible to achieve a performance improvement of 22.85% over the previous value.
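The "contextual attention information" the abstract refers to comes from BERT's self-attention layers. As a generic sketch of that mechanism (scaled dot-product self-attention, not the paper's specific model or its BERT extraction code), each token's output vector is a weighted mix of all token value vectors, with weights from a softmax over query-key similarities:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(q, k, v):
    """Scaled dot-product attention.

    q, k, v: lists of d-dimensional token vectors (lists of floats).
    Returns one contextualized output vector per query token.
    """
    d = len(q[0])
    out = []
    for qi in q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)  # attention distribution over tokens
        # Weighted sum of value vectors.
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out

# Toy 3-token, 2-dimensional example (illustrative values only).
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context = self_attention(tokens, tokens, tokens)
```

In the paper's setting, such attention vectors are extracted from a pretrained BERT and fed to a separate transfer learning model rather than fine-tuning BERT end to end.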

Journal of KIISE
- ISSN : 2383-630X(Print)
- ISSN : 2383-6296(Electronic)
- KCI Accredited Journal
Editorial Office
- Tel. +82-2-588-9240
- Fax. +82-2-521-1352
- E-mail. chwoo@kiise.or.kr