Search: [ keyword: Human-Computer Interaction ] (5)

Real-time Multimodal Audio-to-Tactile Conversion System for Playing or Watching Mobile Shooting Games

Minjae Mun, Gyeore Yun, Chaeyong Park, Seungmoon Choi

http://doi.org/10.5626/JOK.2023.50.3.228

This study presents a real-time multimodal audio-to-tactile conversion system for improving user experience when playing or watching first-person shooting games on a mobile device. The system detects, in real time, whether sounds from the mobile device are suitable for haptic feedback and then renders both vibrotactile feedback, the conventional form of haptic feedback, and impact effects of short, strong force. To this end, we first confirmed the suitability of impact haptic feedback, compared with vibrotactile feedback, for shooting games. We then implemented two types of impulsive sound detectors using psychoacoustic features and a support vector machine, and found that they outperformed a detector from a previous study. Lastly, we conducted a user study to evaluate our system; the results showed that it could significantly improve user experience.
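The abstract does not spell out the detectors' exact form, so the following is only a minimal sketch of how an SVM-based impulsive-sound detector over per-frame audio features might look. The specific features used here (RMS energy, spectral centroid, zero-crossing rate), the frame handling, and the SVM hyperparameters are illustrative assumptions, not the paper's psychoacoustic feature set.

```python
# Minimal sketch of an SVM-based impulsive-sound detector (assumed features).
import numpy as np
from sklearn.svm import SVC

def frame_features(frame, sr=44100):
    """Compute simple per-frame features from a mono audio frame."""
    energy = np.sqrt(np.mean(frame ** 2))                      # RMS energy
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0       # zero-crossing rate
    return np.array([energy, centroid, zcr])

def train_detector(X_train, y_train):
    """X_train: (n_frames, 3) feature matrix; y_train: 1 = impulsive (e.g., gunshot), 0 = other."""
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    clf.fit(X_train, y_train)
    return clf

def detect(clf, frame, sr=44100):
    """Return True if this audio frame should trigger impact haptic feedback."""
    return bool(clf.predict(frame_features(frame, sr).reshape(1, -1))[0])
```

In a real-time pipeline, frames from the device's audio stream would be classified as they arrive, with positive frames driving the vibrotactile and impact actuators.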

Estimation of Finger Motion using Transient EMG Signals

Jin Won Park, Kae Won Choi

http://doi.org/10.5626/JOK.2022.49.2.157

In this paper, we propose a deep learning model for estimating finger movements from EMG signals and evaluate and analyze its accuracy. The model applies the U-Net structure, which is widely used in medical image analysis. Although U-Net is typically used to process two-dimensional images, our model takes 8-channel one-dimensional time-series EMG data as input and produces finger-movement information as output. We acquired a data set of 8,000 motions and divided it into training and evaluation sets. The prediction accuracy of our model is about 89.32%.
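As a rough illustration of adapting a U-Net to 8-channel one-dimensional EMG, the sketch below replaces the usual 2-D convolutions with Conv1d layers in PyTorch. The depth, channel widths, and the per-finger output head are assumptions; the paper's exact architecture and training setup are not reproduced.

```python
# Minimal 1-D U-Net sketch for 8-channel EMG input (assumed layer sizes).
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)

class UNet1D(nn.Module):
    def __init__(self, in_ch=8, n_fingers=5):
        super().__init__()
        self.enc1 = ConvBlock(in_ch, 32)
        self.enc2 = ConvBlock(32, 64)
        self.pool = nn.MaxPool1d(2)
        self.bottleneck = ConvBlock(64, 128)
        self.up2 = nn.ConvTranspose1d(128, 64, kernel_size=2, stride=2)
        self.dec2 = ConvBlock(128, 64)
        self.up1 = nn.ConvTranspose1d(64, 32, kernel_size=2, stride=2)
        self.dec1 = ConvBlock(64, 32)
        self.head = nn.Conv1d(32, n_fingers, kernel_size=1)

    def forward(self, x):                        # x: (batch, 8, time)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)                     # (batch, n_fingers, time)

# Example: a batch of 16 windows, 8 channels, 256 samples each.
# out = UNet1D()(torch.randn(16, 8, 256))
```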

Multimodal Haptic Rendering for Interactive VR Sports Applications

Minjae Mun, Seungjae Oh, Chaeyong Park, Seungmoon Choi

http://doi.org/10.5626/JOK.2022.49.2.97

This study explores how to deliver realistic haptic sensations for virtual collision events in virtual reality (VR). For this purpose, we implemented a multifaceted haptic device that produces both vibration and impact, and designed a haptic rendering method that combines the simulated interactions of a physics engine with collision data recorded from real objects. We also built a virtual simulation of three sports activities, billiards, ping-pong, and tennis, in which a user could interact with virtual objects having different material properties. We performed a user study to evaluate the subjective quality of the haptic feedback under three rendering conditions, vibration, impact, and a multimodal condition combining both, and compared them with real haptic sensations. The results suggested that each rendering condition has distinct perceptual characteristics and that adding a haptic modality can broaden the dynamic range of virtual collisions.
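The abstract does not detail the rendering rule beyond combining physics-engine interactions with recorded collision data, so the following is a minimal sketch of one plausible blending scheme. The impulse scaling, the impact-trigger threshold, and the material-indexed waveform bank are assumptions used only to illustrate how vibration and impact cues could be driven from a single collision event.

```python
# Sketch of a multimodal rendering rule driven by a physics-engine collision impulse.
import numpy as np

IMPACT_THRESHOLD = 0.5   # normalized impulse above which the impact actuator fires (assumed)

def render_collision(impulse, material, waveform_bank):
    """Return (vibration_waveform, fire_impact) for one collision event.

    impulse       -- normalized collision impulse from the physics engine (0..1)
    material      -- key of the struck object's material, e.g. "billiard_ball"
    waveform_bank -- dict mapping material -> recorded contact acceleration waveform
    """
    base = waveform_bank[material]                 # recorded contact transient for this material
    vibration = np.clip(impulse, 0.0, 1.0) * base  # amplitude-scale the recording by impulse
    fire_impact = impulse >= IMPACT_THRESHOLD      # strong hits also trigger an impact pulse
    return vibration, fire_impact
```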

A Dynamic Gesture Recognition System based on Trajectory Data of the Motion-sphere

Jaeyeong Ryu, Adithya B, Ashok Kumar Patil, Youngho Chai

http://doi.org/10.5626/JOK.2021.48.7.781

Dynamic gesture recognition, a branch of human-computer interaction (HCI), has recently received much attention because its interface configuration is simple and it allows fast communication. In this paper, we use a new input data format for a dynamic gesture recognition system and investigate how it improves recognition accuracy. Existing systems mainly use the position and rotation data of joints; the proposed system uses motion-sphere trajectory data instead. The motion-sphere is a visualization technique that expresses movement intuitively as a trajectory and a twist angle on a sphere, and we use its trajectory as the input to the recognition system. The validity of the trajectory data is verified by comparing recognition accuracy in two experiments: the first uses measured quaternion data and the second uses open motion data. Both experiments yielded high recognition accuracy.
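As an illustration of the trajectory construction, the sketch below rotates a fixed reference vector by each quaternion sample so that the orientation stream traces a path on the unit sphere, which could then serve as input to a gesture classifier. The reference direction and the downstream recognizer are assumptions; only the trajectory idea is shown, not the paper's motion-sphere formulation.

```python
# Sketch: quaternion stream -> unit-sphere trajectory (assumed reference vector).
import numpy as np
from scipy.spatial.transform import Rotation as R

REFERENCE = np.array([0.0, 0.0, 1.0])   # assumed reference direction

def quaternions_to_trajectory(quats):
    """quats: (T, 4) array of (x, y, z, w) quaternions -> (T, 3) unit-sphere points."""
    rot = R.from_quat(quats)             # scipy expects (x, y, z, w) order
    points = rot.apply(REFERENCE)        # rotate the reference vector per sample
    # Renormalize for numerical safety; points of a rotated unit vector are already unit length.
    return points / np.linalg.norm(points, axis=1, keepdims=True)

# trajectory = quaternions_to_trajectory(quat_samples)   # feed to the gesture classifier
```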

Comparing Initiating and Responding Joint Attention as a Social Learning Mechanism : A Study Using Human-Avatar Head/Hand Interaction

Mingyu Kim, So-Yeon Kim, Kwanguk Kim

http://doi.org/

Joint Attention (JA) is known to play a key role in human social learning. However, the relative impact of different interaction types has yet to be rigorously examined because existing methodologies are limited in their ability to simulate human-to-human interaction. In the present study, we designed a new JA paradigm that emulates human-avatar interaction using virtual reality technologies and tested it in two experiments with healthy adults. Our results indicated that the initiating JA (IJA) condition was more effective than the responding JA (RJA) condition for social learning in both head and hand interactions. Moreover, hand interaction involved better information processing than head interaction. The implications of the results, the validity of the new paradigm, and the limitations of this study are discussed.

