Abstract
This paper presents a study in which users define intuitive gestures for navigating a humanoid robot. For eleven navigational commands, we analyzed 385 gestures performed by 35 participants. The results reveal user-defined gesture sets for both novice and expert users. In addition, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, time performances of the gesture motions, and implications for the design of robot control, with a focus on gesture recognition and user interfaces.
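The agreement scores mentioned in the abstract are commonly computed with the formula introduced by Wobbrock et al. (2009): for each referent (here, a navigational command), participants' proposed gestures are grouped by identity, and the score is the sum of the squared fractions of each group. The sketch below illustrates this; the gesture labels and counts are hypothetical examples, not data from the paper.

```python
from collections import Counter

def agreement_score(gestures):
    """Agreement score for one referent (Wobbrock et al., 2009):
    A = sum over identical-gesture groups P_i of (|P_i| / |P|)^2,
    where P is the set of all proposals for that referent."""
    counts = Counter(gestures)
    total = len(gestures)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical proposals from 35 participants for one command:
proposals = ["push"] * 20 + ["point"] * 10 + ["lean"] * 5
print(round(agreement_score(proposals), 3))  # → 0.429
```

A score of 1.0 means all participants proposed the same gesture; scores near 1/|P| indicate no consensus, which is why such studies report the score per command when selecting a final gesture set.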
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Obaid, M., Häring, M., Kistler, F., Bühling, R., André, E. (2012). User-Defined Body Gestures for Navigational Control of a Humanoid Robot. In: Ge, S.S., Khatib, O., Cabibihan, JJ., Simmons, R., Williams, MA. (eds) Social Robotics. ICSR 2012. Lecture Notes in Computer Science(), vol 7621. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34103-8_37
Print ISBN: 978-3-642-34102-1
Online ISBN: 978-3-642-34103-8