DOI: 10.1145/1230100.1230125

Quick transitions with cached multi-way blends

Published: 30 April 2007

ABSTRACT

We describe a discriminative method for distinguishing natural-looking from unnatural-looking motion. Our method is based on physical and data-driven features of motion to which humans seem sensitive. We demonstrate that our technique is significantly more accurate than current alternatives.
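The abstract does not specify the classifier, but the discriminative idea can be illustrated with a toy sketch: extract simple physical features from a motion clip and score them with a linear decision rule. Every feature, weight, and threshold below is invented for illustration and is not the paper's actual model.

```python
def motion_features(foot_heights, foot_speeds, height_thresh=0.05):
    """Two toy physical features for one foot over a clip:
    mean foot speed while the foot is near the ground (footskate raises
    this) and peak foot height. Inputs are per-frame scalars."""
    contact_speeds = [s for h, s in zip(foot_heights, foot_speeds)
                      if h < height_thresh]
    mean_contact_speed = (sum(contact_speeds) / len(contact_speeds)
                          if contact_speeds else 0.0)
    return [mean_contact_speed, max(foot_heights)]

def looks_natural(features, weights=(-10.0, -0.5), bias=1.0):
    """Linear decision rule: a positive score classifies as natural.
    The weights here are hand-picked stand-ins, not learned values."""
    score = bias + sum(w * f for w, f in zip(weights, features))
    return score > 0.0
```

A clip with well-planted feet (near-zero foot speed during contact) scores positive, while one with sliding foot plants scores negative.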

We use this technique as the testing part of a hypothesize-and-test motion synthesis procedure. The mechanism we build using this procedure can quickly provide an application with a transition of user-specified duration from any frame in a motion collection to any other frame in the collection. During pre-processing, we search all possible 2-, 3-, and 4-way blends between representative samples of motion obtained using clustering. The blends are automatically evaluated, and the recipe (i.e., the representatives and the set of weighting functions) that created the best blend is cached.
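The pre-processing search can be sketched as follows. This is a heavily simplified illustration: poses are reduced to single floats, a caller-supplied `score` function stands in for the naturalness classifier, and `weight_fns` enumerates candidate weighting functions. None of these names come from the paper.

```python
import itertools

def blend(clips, weights_fn, n_frames):
    """Blend k time-aligned clips frame by frame with time-varying weights.
    weights_fn maps a normalized time t in [0, 1] to k weights summing to 1."""
    out = []
    for f in range(n_frames):
        t = f / (n_frames - 1)
        w = weights_fn(t)
        out.append(sum(wi * clip[f] for wi, clip in zip(w, clips)))
    return out

def linear_ramp(k):
    """One simple candidate weighting: ramp from the first clip to the
    last; any intermediate clips get zero weight."""
    def fn(t):
        w = [0.0] * k
        w[0] = 1.0 - t
        w[-1] = t
        return w
    return fn

def cache_best_recipes(reps, weight_fns, score, n_frames=30):
    """For each ordered pair of representatives, try every 2-, 3-, and
    4-way blend that starts at the source and ends at the target, score
    each candidate, and cache the recipe (participants + weighting
    function) of the best one."""
    cache = {}
    ids = list(range(len(reps)))
    for s in ids:
        for t in ids:
            if s == t:
                continue
            best = None
            others = [i for i in ids if i not in (s, t)]
            for k in (2, 3, 4):
                for mids in itertools.combinations(others, k - 2):
                    parts = (s,) + mids + (t,)
                    for fn in weight_fns(k):
                        sc = score(blend([reps[i] for i in parts], fn, n_frames))
                        if best is None or sc > best[0]:
                            best = (sc, parts, fn)
            cache[(s, t)] = best[1:]
    return cache
```

The cache is built once offline; its cost is quadratic in the number of representatives, which is why clustering the collection down to representatives first matters.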

At run-time, we build a transition between motions by matching a future window of the source motion to a representative, matching the past of the target motion to a representative, and then applying the blend recipe recovered from the cache to the source and target motion. People seem sensitive to poor contact with the environment, such as sliding foot plants. We determine appropriate temporal and positional constraints for each foot plant using a novel method, then apply an off-the-shelf inverse kinematics solver to enforce the constraints. This synthesis procedure yields good-looking transitions between distinct motions at very low online cost.
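The run-time lookup can be sketched as follows (self-contained, with poses again reduced to floats for brevity; the foot-plant constraint and inverse-kinematics clean-up steps are omitted). Here `cache` is assumed to map a (source-representative, target-representative) pair to a cached recipe, and all function names are illustrative.

```python
def nearest_rep(window, reps):
    """Index of the representative whose frames are closest to the
    given motion window (sum of squared per-frame differences)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(reps)), key=lambda i: dist(window, reps[i]))

def apply_recipe(clips, weights_fn, n_frames):
    """Frame-by-frame weighted combination of time-aligned clips."""
    out = []
    for f in range(n_frames):
        t = f / (n_frames - 1)
        w = weights_fn(t)
        out.append(sum(wi * clip[f] for wi, clip in zip(w, clips)))
    return out

def make_transition(source, target, reps, cache, n_frames):
    """Match the source's future window and the target's past window to
    representatives, then apply the cached blend recipe for that pair."""
    s = nearest_rep(source[-n_frames:], reps)
    t = nearest_rep(target[:n_frames], reps)
    participants, weights_fn = cache[(s, t)]
    return apply_recipe([reps[i] for i in participants], weights_fn, n_frames)
```

Because the expensive search happened offline, the online cost is just two nearest-representative queries and one weighted blend, which is what makes transitions of user-specified duration cheap to produce on demand.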


Published in

I3D '07: Proceedings of the 2007 Symposium on Interactive 3D Graphics and Games
April 2007, 196 pages
ISBN: 9781595936288
DOI: 10.1145/1230100
Copyright © 2007 ACM

Publisher: Association for Computing Machinery, New York, NY, United States