Integrating intra and extra gestures into a mobile and multimodal shopping assistant
conference contribution
Posted on 2023-05-23, 08:39. Authored by Wasinger, R; Krueger, A; Jacobs, O
Accompanying the rise of mobile and pervasive computing, applications now need to adapt to their surrounding environments and present information to users in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with external gestures performed on real-world artifacts. The described approach extends reference resolution based on speech, handwriting, and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users, which arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions, and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then forms the basis of a usability study on user interaction in mobile contexts, which we also describe.
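To illustrate the cross-modal reference resolution described in the abstract, here is a minimal sketch. The names below (Hypothesis, resolve_reference) are hypothetical and not taken from the paper; the sketch assumes each modality emits scored hypotheses about the referenced object and simply accumulates scores per candidate. It does not reproduce the Mobile ShopAssist's actual architecture, which would also need to handle timing windows and conflicting modalities.

```python
from dataclasses import dataclass


@dataclass
class Hypothesis:
    """One modality's guess at the referenced object, with a confidence score."""
    modality: str      # e.g. "speech", "handwriting", "intra_gesture", "extra_gesture"
    object_id: str     # candidate on-screen or real-world artifact
    confidence: float  # normalized to [0, 1]


def resolve_reference(hypotheses: list[Hypothesis]) -> str | None:
    """Fuse per-modality hypotheses by summing confidence per candidate object.

    Returns the highest-scoring candidate, or None if no hypotheses were given.
    """
    scores: dict[str, float] = {}
    for h in hypotheses:
        scores[h.object_id] = scores.get(h.object_id, 0.0) + h.confidence
    if not scores:
        return None
    return max(scores, key=scores.get)


# Example: the user says "this camera" while picking up a boxed camera
# (an extra-gesture on a real-world artifact).
print(resolve_reference([
    Hypothesis("speech", "camera_eos300", 0.6),
    Hypothesis("extra_gesture", "camera_eos300", 0.9),
    Hypothesis("speech", "camera_ixus", 0.3),
]))  # -> "camera_eos300"
```

In this toy fusion scheme, agreement between an on-device modality and an action performed in the environment naturally outweighs either channel alone, which is the basic intuition behind combining intra- and extra-gestures.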
History
Publication title
Proceedings of the 3rd International Conference on Pervasive Computing (Pervasive)
Editors
HW Gellersen
Pagination
297-314
ISBN
3-540-26008-0
Department/School
School of Information and Communication Technology
Publisher
Springer
Place of publication
Berlin, Germany
Event title
3rd International Conference on Pervasive Computing (Pervasive)