TY - GEN
T1 - IntelliPupil
T2 - 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2018
AU - Liu, Chang
AU - Plopski, Alexander
AU - Kiyokawa, Kiyoshi
AU - Ratsamee, Photchara
AU - Orlosky, Jason
N1 - Funding Information:
The authors would like to thank all the participants for their time and effort during the experiments. This work was funded in part by the United States Department of the Navy, Office of Naval Research, Grant #N62909-18-1-2036, and by the Japan Society for the Promotion of Science under Grant-in-Aid for Scientific Research (B) #15H02738 and Grant-in-Aid for Young Scientists (B) #17K12726. We would also like to thank Haruo Takemura, Tomohiro Mashita, and Yuki Uranishi for their advice and support.
Publisher Copyright:
© 2018 IEEE.
PY - 2019/1/15
Y1 - 2019/1/15
N2 - In practical use of optical see-through head-mounted displays, users often have to adjust the brightness of virtual content to ensure that it is at the optimal level. Automatic adjustment is still a challenging problem, largely due to the bidirectional nature of the structure of the human eye, the complexity of real-world lighting, and user perception. Allowing the right amount of light to pass through to the retina requires a constant balance of incoming light from the real world, additional light from the virtual image, pupil contraction, and feedback from the user. While some automatic light adjustment methods exist, none have completely tackled this complex input-output system. As a step towards overcoming this issue, we introduce IntelliPupil, an approach that uses eye tracking to properly modulate augmentation lighting for a variety of lighting conditions and real scenes. We first take the data from a small-form-factor light sensor and changes in pupil diameter from an eye-tracking camera as passive inputs. This data is coupled with user-controlled brightness selections, allowing us to fit a brightness model to user preference using a feed-forward neural network. Using a small amount of training data, both scene luminance and pupil size are used as inputs to the neural network, which can then automatically adjust to a user's personal brightness preferences in real time. Experiments in a high dynamic range AR scenario with varied lighting show that pupil size is just as important as environment light for optimizing brightness and that our system outperforms linear models.
AB - In practical use of optical see-through head-mounted displays, users often have to adjust the brightness of virtual content to ensure that it is at the optimal level. Automatic adjustment is still a challenging problem, largely due to the bidirectional nature of the structure of the human eye, the complexity of real-world lighting, and user perception. Allowing the right amount of light to pass through to the retina requires a constant balance of incoming light from the real world, additional light from the virtual image, pupil contraction, and feedback from the user. While some automatic light adjustment methods exist, none have completely tackled this complex input-output system. As a step towards overcoming this issue, we introduce IntelliPupil, an approach that uses eye tracking to properly modulate augmentation lighting for a variety of lighting conditions and real scenes. We first take the data from a small-form-factor light sensor and changes in pupil diameter from an eye-tracking camera as passive inputs. This data is coupled with user-controlled brightness selections, allowing us to fit a brightness model to user preference using a feed-forward neural network. Using a small amount of training data, both scene luminance and pupil size are used as inputs to the neural network, which can then automatically adjust to a user's personal brightness preferences in real time. Experiments in a high dynamic range AR scenario with varied lighting show that pupil size is just as important as environment light for optimizing brightness and that our system outperforms linear models.
KW - augmented reality
KW - eye tracking
KW - Head-mounted displays
KW - lighting adjustment
KW - optical see-through
UR - http://www.scopus.com/inward/record.url?scp=85062167699&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85062167699&partnerID=8YFLogxK
U2 - 10.1109/ISMAR.2018.00037
DO - 10.1109/ISMAR.2018.00037
M3 - Conference contribution
AN - SCOPUS:85062167699
T3 - Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2018
SP - 98
EP - 104
BT - Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2018
A2 - Regenbrecht, Holger
A2 - Grubert, Jens
A2 - Chu, David
A2 - Gabbard, Joseph L.
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 October 2018 through 20 October 2018
ER -