
Computer-automated ergonomic analysis based on motion capture and assembly simulation

  • Original Article
  • Published in: Virtual Reality

Abstract

This paper describes a method of simulating an assembly operation in a fully immersive virtual environment to analyze the postures of workers as they perform assembly operations in aerospace manufacturing. The challenges of capturing the movements of humans performing an assembly operation in a real work environment were overcome by developing a marker-based motion capture system and using it in a cave automatic virtual environment (CAVE). The development of the system focuses on real-time human motion capture and automated simulation for ergonomic analysis. Human movements were tracked in the CAVE using infrared (IR) LEDs mounted on the worker's body. The captured motion data were used to generate a simulation in real time and to perform an ergonomic analysis in Jack software. The system also used a Microsoft Kinect as a marker-less body capture device to scale the digital human model in Jack. The developed system has been demonstrated for human motion capture and ergonomic analysis of a fastening operation on an aircraft fuselage.
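The pipeline the abstract outlines — recover 3D marker positions from multiple camera views, then derive joint angles for posture assessment — can be illustrated with a minimal sketch. The following is not the authors' code; it shows the standard linear (DLT) two-view triangulation commonly used in marker-based optical capture, plus a simple joint-angle computation of the kind that posture-scoring methods such as RULA take as input. The function names, the NumPy implementation, and the toy camera matrices are illustrative assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one IR marker from two views.

    P1, P2 : 3x4 camera projection matrices (obtained from calibration).
    x1, x2 : (u, v) normalized image coordinates of the marker in each view.
    Returns the marker's 3D position in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of A
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # de-homogenize

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c,
    e.g. the elbow angle from shoulder, elbow, and wrist markers."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

# Toy example: a marker at (1, 2, 10) seen by two unit-focal-length cameras,
# the second translated one unit along the x-axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.array([[1.0, 0.0, 0.0, -1.0],
               [0.0, 1.0, 0.0,  0.0],
               [0.0, 0.0, 1.0,  0.0]])
X = triangulate(P1, P2, (0.1, 0.2), (0.0, 0.2))   # recovers (1, 2, 10)
```

In a real system the projection matrices come from camera calibration, markers must be matched across views before triangulation, and the resulting joint angles feed the ergonomic scoring tables rather than being reported directly.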



Acknowledgments

The authors would like to acknowledge the financial support for this research from the Industrial Consortium of the Center for Aerospace Manufacturing Technologies (CAMT). The help of Peter Wu and Alpha Chang in initiating and conducting the project is especially appreciated.

Author information

Correspondence to Sajeev C. Puthenveetil.


Cite this article

Puthenveetil, S.C., Daphalapurkar, C.P., Zhu, W. et al. Computer-automated ergonomic analysis based on motion capture and assembly simulation. Virtual Reality 19, 119–128 (2015). https://doi.org/10.1007/s10055-015-0261-9
