ABSTRACT
Understanding and estimating human attention in different interactive scenarios is an important aspect of human-computer interaction. With the advent of wearable eye-tracking glasses and devices such as Google Glass, monitoring of human visual attention will soon become ubiquitous. The presented work describes the precise estimation of human gaze fixations with respect to the environment, without the need for artificial landmarks in the field of view, and is capable of mapping attention onto 3D information. It enables full 3D recovery of the human view frustum and the gaze pointer in a previously acquired 3D model of the environment in real time. The key contribution is that our methodology enables mapping of fixations directly into an automatically computed 3D model. This innovative methodology will open new opportunities for human attention studies during interaction with the environment, bringing new potential into automated processing for human factors technologies.
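The core geometric step described above, recovering the 3D gaze pointer in a previously acquired model, can be illustrated with a small sketch. Assuming the wearer's scene-camera pose (rotation `R`, translation `t`, world-to-camera convention) has been estimated by localization against the model, the 2D gaze point reported by the eye tracker is back-projected into a world-space ray and intersected with the model geometry (here a single triangle, via the standard Möller–Trumbore test). All function names and the pinhole/pose conventions are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def gaze_ray(K, R, t, gaze_px):
    """Back-project a 2D gaze point (pixels) in the scene-camera image
    into a world-space ray. K is the 3x3 intrinsic matrix; (R, t) maps
    world points to camera coordinates: x_cam = R @ x_world + t."""
    p = np.array([gaze_px[0], gaze_px[1], 1.0])
    d_cam = np.linalg.inv(K) @ p            # ray direction in camera frame
    d_world = R.T @ d_cam                   # rotate into world frame
    origin = -R.T @ t                       # camera centre in world frame
    return origin, d_world / np.linalg.norm(d_world)

def intersect_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection.
    Returns the 3D hit point, or None if the ray misses the triangle."""
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(direction, e2)
    a = e1 @ h
    if abs(a) < eps:                        # ray parallel to triangle plane
        return None
    f = 1.0 / a
    s = origin - v0
    u = f * (s @ h)
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = f * (direction @ q)
    if v < 0.0 or u + v > 1.0:
        return None
    t_hit = f * (e2 @ q)                    # distance along the ray
    return origin + t_hit * direction if t_hit > eps else None
```

In a full pipeline this test would run against all model triangles (accelerated by a bounding-volume hierarchy such as the OBB-tree), and the nearest hit gives the 3D fixation point in the model's coordinate frame.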
Index Terms
- 3D attention: measurement of visual saliency using eye tracking glasses