TY - GEN
T1 - Attention engagement and cognitive state analysis for augmented reality text display functions
AU - Toyama, Takumi
AU - Sonntag, Daniel
AU - Orlosky, Jason
AU - Kiyokawa, Kiyoshi
PY - 2015/3/18
Y1 - 2015/3/18
N2 - Human eye gaze has recently been used as an effective input interface for wearable displays. In this paper, we propose a gaze-based interaction framework for optical see-through displays. The proposed system can automatically judge whether a user is engaged with virtual content in the display or focused on the real environment, and can determine his or her cognitive state. With these analytic capabilities, we implement several proactive system functions, including adaptive brightness, scrolling, messaging, notification, and highlighting, which would otherwise require manual interaction. The goal is to manage the relationship between the virtual and the real, creating a more cohesive and seamless experience for the user. We conduct user experiments on attention engagement and cognitive state analysis, such as reading detection and gaze position estimation in a wearable display, toward the design of augmented reality text display applications. The results from the experiments show the robustness of the attention engagement and cognitive state analysis methods, and a majority of the participants (8/12) stated that the proactive system functions are beneficial. Copyright © 2015 ACM.
KW - Attention engagement analysis
KW - Augmented reality
KW - Cognitive state analysis
KW - Eye tracking
KW - Gaze-based interface
UR - http://www.scopus.com/inward/record.url?scp=84939617308&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84939617308&partnerID=8YFLogxK
U2 - 10.1145/2678025.2701384
DO - 10.1145/2678025.2701384
M3 - Conference contribution
AN - SCOPUS:84939617308
T3 - International Conference on Intelligent User Interfaces, Proceedings IUI
SP - 322
EP - 332
BT - IUI 2015 - Proceedings of the 20th ACM International Conference on Intelligent User Interfaces
PB - Association for Computing Machinery
T2 - 20th ACM International Conference on Intelligent User Interfaces, IUI 2015
Y2 - 29 March 2015 through 1 April 2015
ER -