M3I in a pedestrian navigation & exploration system
conference contribution
posted on 2023-05-23, 08:40, authored by Wasinger, R, Stahl, C, Krueger, A
In this paper, we describe a near-complete Pocket PC implementation of a Mobile Multi-Modal Interaction (M3I) platform for pedestrian navigation. The platform is designed to support indoor and outdoor navigation tasks with ease, and combines several modalities for presentation output and user input. Whereas 2D/3D graphics and synthesized speech are used to present useful information on routes and places, fused input from embedded speech and gesture recognition engines allows for situated user interaction.
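As a rough illustration of the kind of speech-and-gesture fusion described above, the following Python sketch is a minimal, hypothetical example; the class names, event fields, and time threshold are assumptions for illustration only and do not reflect the M3I platform's actual API. It pairs a recognized utterance containing a deictic reference ("this", "that") with a temporally close pointing gesture to resolve which object the user means.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical input events; the real platform's data structures are not shown in this record.
@dataclass
class SpeechEvent:
    text: str          # recognized utterance, e.g. "What is this?"
    timestamp: float   # seconds since session start

@dataclass
class GestureEvent:
    target_id: str     # object the user pointed at, e.g. "exhibit_12"
    timestamp: float

def fuse(speech: SpeechEvent, gesture: GestureEvent,
         max_gap: float = 1.5) -> Optional[str]:
    """Resolve a deictic utterance against a pointing gesture if the two
    events occur within max_gap seconds of each other (assumed threshold)."""
    deictic = any(word in speech.text.lower() for word in ("this", "that"))
    if deictic and abs(speech.timestamp - gesture.timestamp) <= max_gap:
        return f'"{speech.text}" -> {gesture.target_id}'
    return None

if __name__ == "__main__":
    s = SpeechEvent(text="What is this?", timestamp=10.2)
    g = GestureEvent(target_id="exhibit_12", timestamp=10.6)
    print(fuse(s, g))  # prints: "What is this?" -> exhibit_12
```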
History
Publication title
Proceedings of Mobile HCI 2003
Editors
L Chittaro
Pagination
481-485
Department/School
School of Information and Communication Technology
Publisher
Springer
Place of publication
Germany
Event title
Fifth International Symposium on Human-Computer Interaction with Mobile Devices (Mobile HCI)