APPLIED SCIENCES, vol. 14, no. 14, pp. 6042-6072, 2024 (SCI-Expanded)
Our research systematically investigates the cognitive and emotional
processes revealed through eye movements within the context of virtual
reality (VR) environments. We assess the utility of eye-tracking data
for predicting emotional states in VR, employing explainable artificial
intelligence (XAI) to advance the interpretability and transparency of
our findings. Utilizing the VR Eyes: Emotions dataset (VREED) alongside
an extra trees classifier enhanced by SHapley Additive exPlanations
(SHAP) and local interpretable model-agnostic explanations (LIME), we
rigorously evaluate the importance of various eye-tracking metrics. Our
results identify significant correlations between eye-movement metrics
(saccades, microsaccades, blinks, and fixations) and specific emotional
states. The application of SHAP and LIME elucidates these relationships,
providing deeper insights into the emotional responses triggered by VR.
These findings suggest that variations in eye-movement feature patterns serve as
indicators of heightened emotional arousal. Not only do these insights
advance our understanding of affective computing within VR, but they
also highlight the potential for developing more responsive VR systems
capable of adapting to user emotions in real time. This research
contributes significantly to the fields of human-computer interaction
and psychological research, showcasing how XAI can bridge the gap
between complex machine-learning models and practical applications,
thereby facilitating the creation of reliable, user-sensitive VR
experiences. Future research may explore the integration of multiple
physiological signals to enhance emotion detection and interactive
dynamics in VR.
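
As an illustrative sketch of the pipeline summarized above, the following Python snippet shows how an extra trees classifier might be combined with SHAP and LIME on tabular eye-tracking features. The feature names, label encoding, and synthetic data are placeholder assumptions for demonstration only and are not drawn from VREED or the paper's actual feature set.

import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split
import shap
from lime.lime_tabular import LimeTabularExplainer

# Placeholder eye-tracking feature table: one row per trial, columns are
# aggregate metrics of the kind discussed above (all values synthetic).
feature_names = ["fixation_duration", "saccade_rate", "microsaccade_rate", "blink_rate"]
rng = np.random.default_rng(0)
X = rng.normal(size=(200, len(feature_names)))   # stand-in for extracted eye-tracking features
y = rng.integers(0, 4, size=200)                 # stand-in emotion labels (e.g., four classes)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Extra trees classifier, as named in the abstract.
clf = ExtraTreesClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)

# SHAP: tree-based attributions quantifying how each metric contributes to each prediction.
shap_values = shap.TreeExplainer(clf).shap_values(X_test)

# LIME: a local surrogate explanation for a single test trial.
lime_explainer = LimeTabularExplainer(
    X_train,
    feature_names=feature_names,
    class_names=[f"emotion_{c}" for c in range(4)],
    discretize_continuous=True,
)
lime_exp = lime_explainer.explain_instance(
    X_test[0], clf.predict_proba, num_features=len(feature_names)
)
print(lime_exp.as_list())   # (feature, weight) pairs for the explained trial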