TY - GEN
T1 - Approximated Match Swiping
T2 - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
AU - Mansour, Adam
AU - Orlosky, Jason
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - In augmented and virtual reality, hands-free text input is still a significant challenge. Though some virtual keyboard systems exist, they often require the use of controllers, suffer from the Midas touch problem, or require significant activation times to allow for the selection of keys. To help address this problem, we present a system called Approximated Match Swiping (AMSwipe for short), a purely gaze-based text input system that allows for efficient typing in virtual environments. This system builds on prior eye-based path/swipe systems for word recognition and selection, in which users look at the keys on a virtual keyboard in succession to trace out a word path. Unlike other systems such as reverse-crossing, our method makes use of the spacebar to start or stop input, and the user can chain words together with additional brief glances at the spacebar, allowing for faster transitions between words. This system works for virtual and augmented reality as well as desktop and mobile systems. In an experiment testing our design alongside other well-known input systems, participants achieved an average raw typing speed of 7.81 words per minute (WPM) with under five minutes of practice, with an expert user reaching an average of 23 WPM. We compared our technique with two other methods, Dwell and reverse-crossing, which yielded similar novice typing speeds of 8.19 WPM and 6.40 WPM, respectively.
AB - In augmented and virtual reality, hands-free text input is still a significant challenge. Though some virtual keyboard systems exist, they often require the use of controllers, suffer from the Midas touch problem, or require significant activation times to allow for the selection of keys. To help address this problem, we present a system called Approximated Match Swiping (AMSwipe for short), a purely gaze-based text input system that allows for efficient typing in virtual environments. This system builds on prior eye-based path/swipe systems for word recognition and selection, in which users look at the keys on a virtual keyboard in succession to trace out a word path. Unlike other systems such as reverse-crossing, our method makes use of the spacebar to start or stop input, and the user can chain words together with additional brief glances at the spacebar, allowing for faster transitions between words. This system works for virtual and augmented reality as well as desktop and mobile systems. In an experiment testing our design alongside other well-known input systems, participants achieved an average raw typing speed of 7.81 words per minute (WPM) with under five minutes of practice, with an expert user reaching an average of 23 WPM. We compared our technique with two other methods, Dwell and reverse-crossing, which yielded similar novice typing speeds of 8.19 WPM and 6.40 WPM, respectively.
KW - augmented reality
KW - eye tracking
KW - text input
KW - typing
KW - virtual reality
KW - xr
UR - http://www.scopus.com/inward/record.url?scp=85180369857&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85180369857&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct60411.2023.00037
DO - 10.1109/ISMAR-Adjunct60411.2023.00037
M3 - Conference contribution
AN - SCOPUS:85180369857
T3 - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
SP - 141
EP - 145
BT - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
A2 - Bruder, Gerd
A2 - Olivier, Anne-Helene
A2 - Cunningham, Andrew
A2 - Peng, Evan Yifan
A2 - Grubert, Jens
A2 - Williams, Ian
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 October 2023 through 20 October 2023
ER -