TY - GEN
T1 - EyeShadows
T2 - 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
AU - Orlosky, Jason
AU - Liu, Chang
AU - Sakamoto, Kenya
AU - Sidenmark, Ludwig
AU - Mansour, Adam
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In eye-tracked augmented and virtual reality (AR/VR), instantaneous and accurate hands-free selection of virtual elements is still a significant challenge. Though other methods that involve gaze-coupled head movements or hovering can improve selection times in comparison to methods like gaze-dwell, they are either not instantaneous or have difficulty ensuring that the user's selection is deliberate. In this paper, we present EyeShadows, an eye gaze-based selection system that takes advantage of peripheral copies (shadows) of items that allow for quick selection and manipulation of an object or corresponding menus. This method is compatible with a variety of different selection tasks and controllable items, avoids the Midas touch problem, does not clutter the virtual environment, and is context-sensitive. We have implemented and refined this selection tool for VR and AR, including testing with optical and video see-through (OST/VST) displays. Moreover, we demonstrate that this method can be used for a wide range of AR and VR applications, including manipulation of sliders or analog elements. We test its performance in VR against three other selection techniques: dwell (baseline), an inertial reticle, and head-coupled selection. Results showed that selection with EyeShadows was significantly faster than dwell (baseline), outperforming it in the select task and the search-and-select task by 29.8% and 15.7%, respectively, though error rates varied between tasks.
AB - In eye-tracked augmented and virtual reality (AR/VR), instantaneous and accurate hands-free selection of virtual elements is still a significant challenge. Though other methods that involve gaze-coupled head movements or hovering can improve selection times in comparison to methods like gaze-dwell, they are either not instantaneous or have difficulty ensuring that the user's selection is deliberate. In this paper, we present EyeShadows, an eye gaze-based selection system that takes advantage of peripheral copies (shadows) of items that allow for quick selection and manipulation of an object or corresponding menus. This method is compatible with a variety of different selection tasks and controllable items, avoids the Midas touch problem, does not clutter the virtual environment, and is context-sensitive. We have implemented and refined this selection tool for VR and AR, including testing with optical and video see-through (OST/VST) displays. Moreover, we demonstrate that this method can be used for a wide range of AR and VR applications, including manipulation of sliders or analog elements. We test its performance in VR against three other selection techniques: dwell (baseline), an inertial reticle, and head-coupled selection. Results showed that selection with EyeShadows was significantly faster than dwell (baseline), outperforming it in the select task and the search-and-select task by 29.8% and 15.7%, respectively, though error rates varied between tasks.
KW - Augmented Reality
KW - Eye Tracking
KW - Hands-free
KW - Manipulation
KW - Selection
KW - Virtual Reality
UR - http://www.scopus.com/inward/record.url?scp=85191465628&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85191465628&partnerID=8YFLogxK
U2 - 10.1109/VR58804.2024.00088
DO - 10.1109/VR58804.2024.00088
M3 - Conference contribution
AN - SCOPUS:85191465628
T3 - Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
SP - 681
EP - 689
BT - Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 March 2024 through 21 March 2024
ER -