Digital Library [Search Result]
Requirements Elicitation and Specification Method for the Development of Adaptive User Interface
Many studies have addressed the ‘Adaptive User Interface (AUI)’, in which the user interface changes at runtime in accordance with each user’s situation and environment. Nevertheless, previous AUI studies have rarely reflected the viewpoint of the requirements engineering field, since most of them focus on proposing architectures and designs. In this study, we examine AUI from the perspective of requirements engineering and propose a requirements elicitation and specification method based on concepts researched in the area of self-adaptive systems. Step by step, we first redefine and reinterpret well-known concepts of self-adaptive software, after which the AUI requirements are elicited and specified. Finally, we present a case study that demonstrates the effectiveness of our method.
Quantified Lockscreen: Integration of Personalized Facial Expression Detection and Mobile Lockscreen application for Emotion Mining and Quantified Self
Sung Sil Kim, Junsoo Park, Woontack Woo
The lockscreen is one of the interfaces most frequently encountered by smartphone users. Although users perform unlocking actions every day, lockscreens offer no benefit beyond security and authentication. In this paper, we replace the traditional lockscreen with an application that analyzes facial expressions in order to collect facial expression data and provide real-time feedback to users. To evaluate this concept, we implemented the Quantified Lockscreen application, which supports the following contributions of this paper: 1) an unobtrusive interface for collecting facial expression data and evaluating emotional patterns, 2) improved accuracy of facial expression detection through a personalized machine learning process, and 3) enhanced validity of emotion data through a bidirectional, multi-channel, and multi-input methodology.
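The "personalized machine learning process" mentioned in the abstract can be illustrated with a minimal sketch: a generic per-emotion model is gradually shifted toward a particular user's own labeled samples, so detection adapts to that user's facial geometry. The sketch below is hypothetical, not the paper's implementation; the class name, the nearest-centroid approach, and the two-dimensional feature vectors are all assumptions chosen for brevity.

```python
import math


class PersonalizedExpressionClassifier:
    """Nearest-centroid classifier over facial feature vectors.

    Starts from generic per-emotion centroids and shifts each one toward
    the user's own labeled samples (a running mean), so that detection
    adapts to that user's facial geometry over time.
    """

    def __init__(self, generic_centroids):
        # emotion -> (centroid vector, number of samples absorbed so far)
        self.centroids = {e: (list(c), 1) for e, c in generic_centroids.items()}

    def personalize(self, features, emotion):
        """Fold one user-labeled sample into the matching centroid."""
        centroid, n = self.centroids[emotion]
        updated = [(c * n + f) / (n + 1) for c, f in zip(centroid, features)]
        self.centroids[emotion] = (updated, n + 1)

    def classify(self, features):
        """Return the emotion whose centroid is closest to the features."""
        return min(self.centroids,
                   key=lambda e: math.dist(self.centroids[e][0], features))


# A user whose smiles register weakly is misread by the generic model:
clf = PersonalizedExpressionClassifier({"happy": [1.0, 0.0],
                                        "neutral": [0.0, 0.0]})
before = clf.classify([0.3, 0.0])          # "neutral" under generic centroids
for _ in range(5):                         # a few user-labeled smile samples
    clf.personalize([0.3, 0.0], "happy")
after = clf.classify([0.3, 0.0])           # now recognized as "happy"
```

In a real pipeline the features would come from a facial-landmark or embedding model rather than raw 2-D points, but the personalization loop — accumulate user-labeled samples, re-estimate per-class parameters — has the same shape.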
A GUI State Comparison Technique for Effective Model-based Android GUI Testing
Youngmin Baek, Gwangui Hong, Doo-hwan Bae
Graphical user interface (GUI) testing techniques have been widely used to test the functionality of Android applications (apps) and to detect faults when verifying the reliability and usability of apps. To adequately test the behaviors of apps, a number of studies on model-based GUI testing have been performed on Android apps. However, the effectiveness of model-based techniques depends greatly on the quality of the GUI model, because these techniques generate test inputs from it. Therefore, improving testing effectiveness requires accurate and efficient GUI model generation, which in turn requires an improved model generation technique with a concrete definition of GUI states. To that end, this study proposes a hierarchical GUI state comparison technique and evaluates it against existing model-based techniques that consider activities as GUI states. Our results show that the proposed technique outperforms existing approaches and has the potential to improve the performance of model-based GUI testing techniques for Android apps.
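The core idea — that comparing GUI states only by activity merges genuinely different screens, while a hierarchical comparison can also look at the widgets a screen contains — can be sketched as follows. This is an illustrative sketch, not the paper's algorithm; the class names, fields, and the two comparison levels shown are assumptions.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Widget:
    kind: str       # e.g. "Button", "TextView"
    widget_id: str  # Android resource id


@dataclass(frozen=True)
class GuiState:
    activity: str        # activity class name
    widgets: frozenset   # visible Widget instances on this screen


def same_state(a, b, level="widget"):
    """Compare two GUI states hierarchically.

    level="activity": states are equal if they belong to the same activity
                      (the coarse criterion used by activity-based models).
    level="widget":   additionally require the same set of visible widgets,
                      distinguishing different screens of one activity.
    """
    if a.activity != b.activity:
        return False
    if level == "activity":
        return True
    return a.widgets == b.widgets


# Two screens of the same activity that differ in their visible widgets:
s1 = GuiState("MainActivity", frozenset({Widget("Button", "btn_ok")}))
s2 = GuiState("MainActivity", frozenset({Widget("Button", "btn_ok"),
                                         Widget("TextView", "txt_msg")}))
coarse = same_state(s1, s2, level="activity")  # merged: same activity
fine = same_state(s1, s2, level="widget")      # distinguished: widgets differ
```

A model builder using the coarse level would collapse `s1` and `s2` into one model node and miss inputs reachable only from one of them; the finer level keeps them separate at the cost of a larger model, which is the accuracy/efficiency trade-off the abstract refers to.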

Journal of KIISE
- ISSN : 2383-630X(Print)
- ISSN : 2383-6296(Electronic)
- KCI Accredited Journal
Editorial Office
- Tel. +82-2-588-9240
- Fax. +82-2-521-1352
- E-mail. chwoo@kiise.or.kr