ABSTRACT
We present an innovative multi-modal interaction concept based on human-centered design for control centers. The multi-layered hardware and software architecture directly supports users in performing their lengthy monitoring and urgent alarm-handling tasks. We combine visual cues, gestural interaction, audio information, and intelligent data processing into a single, universal interface. We have realized the presented concept in a prototypical implementation using state-of-the-art interaction technologies. The paper further critically reflects on the long-term applicability of the proposed interfaces and outlines immediate plans for their evaluation. Finally, we indicate several research challenges regarding the real-world application of the presented interaction concepts.