TY - GEN
T1 - Framework for bimanual inter-device interactions
AU - Roudaki, Ali
AU - Kong, Jun
AU - Walia, Gursimran
N1 - Publisher Copyright:
© 2014 by Knowledge Systems Institute Graduate School.
PY - 2014
Y1 - 2014
N2 - A shared interactive display (e.g., a tabletop) provides a large space for collaborative interactions. However, a public display lacks a private space for accessing sensitive information. On the other hand, a mobile device offers a private display and a variety of modalities for personal applications, but it is limited by a small screen. We have developed a framework that supports fluid and seamless interactions between a tabletop and multiple mobile devices. This framework can continuously track each user’s actions (e.g., hand movements or gestures) on top of a tabletop and then automatically generate a unique personal interface on an associated mobile device. This type of inter-device interaction integrates a collaborative workspace (i.e., a tabletop) and a private area (i.e., a mobile device) with multimodal feedback. To support this interaction style, an event-driven architecture is applied to implement the framework on the Microsoft PixelSense tabletop. This framework hides the details of user tracking and inter-device communication. Thus, interface designers can focus on the development of domain-specific interactions by mapping a user’s actions on a tabletop to a personal interface on his/her mobile device. A user study compared our interaction style with a standard tabletop interface and justified the usability of the proposed interaction.
AB - A shared interactive display (e.g., a tabletop) provides a large space for collaborative interactions. However, a public display lacks a private space for accessing sensitive information. On the other hand, a mobile device offers a private display and a variety of modalities for personal applications, but it is limited by a small screen. We have developed a framework that supports fluid and seamless interactions between a tabletop and multiple mobile devices. This framework can continuously track each user’s actions (e.g., hand movements or gestures) on top of a tabletop and then automatically generate a unique personal interface on an associated mobile device. This type of inter-device interaction integrates a collaborative workspace (i.e., a tabletop) and a private area (i.e., a mobile device) with multimodal feedback. To support this interaction style, an event-driven architecture is applied to implement the framework on the Microsoft PixelSense tabletop. This framework hides the details of user tracking and inter-device communication. Thus, interface designers can focus on the development of domain-specific interactions by mapping a user’s actions on a tabletop to a personal interface on his/her mobile device. A user study compared our interaction style with a standard tabletop interface and justified the usability of the proposed interaction.
KW - Bimanual interaction
KW - Human computer interaction
KW - Multimodal interface
KW - Tangible interface
UR - http://www.scopus.com/inward/record.url?scp=84923913794&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84923913794&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84923913794
T3 - Proceedings: DMS 2014 - 20th International Conference on Distributed Multimedia Systems
SP - 113
EP - 120
BT - Proceedings
PB - Knowledge Systems Institute Graduate School
T2 - 20th International Conference on Distributed Multimedia Systems, DMS 2014
Y2 - 27 August 2014 through 29 August 2014
ER -