Proceeding Paper

Real-Time Motion Tracking for Humans and Robots in a Collaborative Assembly Task †

Department of Mechanical Engineering, Chair for Production Automation and Assembly, University of Siegen, 57068 Siegen, Germany
* Author to whom correspondence should be addressed.
Presented at the 6th International Electronic Conference on Sensors and Applications, 15–30 November 2019; Available online: https://ecsa-6.sciforum.net/.
Proceedings 2020, 42(1), 48; https://doi.org/10.3390/ecsa-6-06636
Published: 14 November 2019

Abstract

Human-robot collaboration combines the extended capabilities of humans and robots to create a more inclusive and human-centered production system in the future. However, human safety is the primary concern for manufacturing industries. Therefore, real-time motion tracking is necessary to identify whether the human worker's body parts enter the restricted working space dedicated solely to the robot. Tracking these motions with decentralized and heterogeneous tracking systems requires a generic model controller and consistent motion exchange formats. In this work, our task is to investigate a concept for unified real-time motion tracking for human-robot collaboration. In this regard, a low-cost, game-based motion tracking system, e.g., HTC Vive, is utilized to capture human motion by mapping it onto a digital human model in the Unity3D environment. In this context, the human model is described using a biomechanical model that comprises joint segments defined by position and orientation. Concerning robot motion tracking, a unified robot description format is used to describe the kinematic trees. Finally, an assembly operation that involves snap joining is simulated to analyze the real-time capability of the system. The distribution of joint variables in spatial-space and time-space is analyzed. The results suggest that real-time tracking in human-robot collaborative assembly environments can be considered to maximize the safety of the human worker. However, the accuracy and reliability of the system with respect to disturbances still need to be evaluated.

1. Introduction

Human-robot collaboration is currently becoming more useful for assembly operations. Some assembly tasks such as snap-joining induce high acceleration during the engagement or disengagement of two parts [1]. A combined motion tracking approach for both humans and robots is necessary to develop a concept of safe human-robot physical collaboration [2]. In assembly tasks such as snap-joining, basic motions such as pick—reach—join—apply force—move are the typical activities. Such motions require a control mechanism that ensures the position and orientation of the joint variables. Similarly, safety has to be ensured for the human worker, particularly to address the motion outliers created by the high acceleration. If real-time motion tracking is assumed, the robot motion is prescribed by the motion planning algorithm. In the same way, joint state motion tracking can be used for monitoring and control. This approach can be used to develop closed-loop control of the robot in real time [3]. Therefore, real-time motion tracking, path planning, assembly task description and system configuration are crucial to derive a concept for real-time assembly operation that creates safe physical collaboration between humans and robots.
Real-time motion tracking for human-robot collaboration can be considered from the perspective of tracking systems and system capabilities. Different tracking systems, such as optical systems and inertial measurement units, have been employed to capture motions [4,5,6,7]. Path planning in assembly tasks considers the modality of human motion behavior, task descriptions and robot motions. Planning assembly tasks, in other words, defines the sequence of operations and the component attributes. For tasks that involve applied force, parameters such as the magnitude of the applied force, the direction of action, the controlling mechanism and the retraction method are considered. In snap-joining, force is usually applied along a specific direction. The work of [8] considered assembly tasks for elastic parts using dual robot arms and a human. The assembly process involved the insertion of O-rings into a cylinder. The human hand movement was captured to derive a concept and a strategy for generating robot motion. In this case, a Leap Motion sensor was used to capture the finger movement.
Monitoring and visualization for hybrid systems are nowadays much simpler owing to the ability of middleware systems to communicate via user datagram protocol (UDP) or transmission control protocol (TCP). In the robotics field, some industrial robots are supported by the Robot Operating System Industrial (ROS-I). The ROS-I community actively shares developments and concepts, which makes it advantageous for implementing user-oriented control techniques. ROS-I is based on the ROS meta-operating system and is mainly developed for industrial systems [9]. Plugins such as MoveIt! and Gazebo are commonly used to generate motion trajectories and to model the geometry simultaneously. However, the geometric visualization and rendering quality of Gazebo is limited. In this regard, game engines such as Unity3D can be considered a potential alternative for obtaining good graphical rendering, particularly for virtual commissioning or process monitoring. Recent works show that Unity3D can be coupled with a robot operating system such as ROS-I to enable real-time streaming or offline simulation. Notably, the authors in [10] presented a framework to simulate and monitor industrial processes using Unity3D; however, that work does not address how real-time motion tracking can be applied to human-robot collaboration. Regarding the application of Unity3D to human-robot interaction, initial work is presented in [11], which considers a Unity3D-based robot engine to control robots. Recently, ROS# [12] was developed by Siemens to simplify the interfacing of ROS-I and Unity3D systems through WebSockets.
In this work, our task is to investigate a method to implement simplified and decentralized motion tracking using a consistent file exchange format for both human and robot models in real time, using low-cost motion capturing systems. In this context, we implement digital models of both the human and the robot that apply a kinematic model to control the movement of the human and robot skeletons. We combine robot joint motions and human motion in a single intuitive graphical interface so that motions from the sensory systems can be tracked easily. In this regard, a low-cost motion tracking system, primarily commercialized for gaming, is used to develop the concept of motion tracking in human-robot collaborative environments. A generic model controller achieves this objective for both robot and human joints in the Unity3D environment. Meanwhile, ROS-I and SteamVR are employed to facilitate the communication between the graphical controllers and the data flow. In the model, kinematic trees are used to prescribe joint motions during each motion step. Finally, the motion data is visualized and analyzed in terms of distribution and position.
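To make the kinematic model concrete, the following minimal C# sketch shows one possible form of such a kinematic tree, with joint segments defined by a local position and orientation whose world poses are propagated from the root to every child joint. This is an illustrative assumption on our part, not the published controller; the class and joint names are hypothetical.

```csharp
// Minimal kinematic-tree sketch (illustrative assumption, not the published
// controller): each segment stores a local position and orientation, and
// world poses are propagated recursively from the root.
using System;
using System.Collections.Generic;
using System.Numerics;

class JointSegment
{
    public string Name;
    public Vector3 LocalPosition;      // offset from the parent joint
    public Quaternion LocalRotation;   // current joint rotation
    public List<JointSegment> Children = new List<JointSegment>();

    public void ComputeWorldPose(Vector3 parentPos, Quaternion parentRot,
                                 IDictionary<string, Vector3> result)
    {
        // Child pose = parent pose composed with the local offset and rotation.
        Vector3 worldPos = parentPos + Vector3.Transform(LocalPosition, parentRot);
        Quaternion worldRot = Quaternion.Concatenate(LocalRotation, parentRot);
        result[Name] = worldPos;
        foreach (var child in Children)
            child.ComputeWorldPose(worldPos, worldRot, result);
    }
}

class Demo
{
    static void Main()
    {
        // Two-segment arm: rotating the shoulder 90 degrees about Z moves the
        // elbow from (0.3, 0, 0) to approximately (0, 0.3, 0).
        var elbow = new JointSegment { Name = "elbow",
            LocalPosition = new Vector3(0.3f, 0, 0), LocalRotation = Quaternion.Identity };
        var shoulder = new JointSegment { Name = "shoulder",
            LocalPosition = Vector3.Zero,
            LocalRotation = Quaternion.CreateFromAxisAngle(Vector3.UnitZ, MathF.PI / 2),
            Children = { elbow } };

        var poses = new Dictionary<string, Vector3>();
        shoulder.ComputeWorldPose(Vector3.Zero, Quaternion.Identity, poses);
        Console.WriteLine(poses["elbow"]);
    }
}
```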

2. Materials and Methods

As pointed out in the introduction, the Robot Operating System (ROS) is widely used for robot control, allowing users to implement their desired motion control strategies programmatically [9,13,14]. This approach is essential for integrating sensors and high-level control strategies whenever necessary. However, the ROS-I community commonly uses the Python and C++ programming languages to develop drivers and controllers, whereas game engines such as Unity3D natively use C#, which makes ROS-I integration not straightforward. The ROS# asset developed by Siemens, released under the Apache License 2.0, simplifies this system configuration: it provides a simple interface between ROS-I and Unity3D based on WebSockets. We applied this framework to integrate the robot system, the HTC Vive system and the ROS-I system on a single machine running a Linux operating system. ROS-I was the master that controlled the communication data from each system, and Unity3D was used to control the digital models for virtual process verification. As a game engine, Unity3D has strong graphical rendering capabilities and can simulate a virtual scene at a high frame rate. Therefore, it was easy to observe the details of the motion behavior during real-time tracking.
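As a hedged illustration of this configuration (not the authors' code), the following C# sketch subscribes to the robot joint states through the ROS# RosSocket API. It assumes a rosbridge WebSocket server is reachable at ws://localhost:9090, and the message-type namespace may differ slightly between ROS# versions.

```csharp
// Hedged sketch: receive /joint_states from ROS-I through the ROS# WebSocket
// bridge. Assumes rosbridge is running at ws://localhost:9090.
using RosSharp.RosBridgeClient;
using RosSharp.RosBridgeClient.Protocols;
using JointState = RosSharp.RosBridgeClient.MessageTypes.Sensor.JointState;

class JointStateListener
{
    static void Main()
    {
        // Connect the C#/Unity side to the ROS-I master via the WebSocket bridge.
        var rosSocket = new RosSocket(new WebSocketNetProtocol("ws://localhost:9090"));

        // Print each joint position published by the robot driver.
        rosSocket.Subscribe<JointState>("/joint_states", msg =>
        {
            for (int i = 0; i < msg.name.Length; i++)
                System.Console.WriteLine($"{msg.name[i]}: {msg.position[i]:F3} rad");
        });

        System.Console.ReadLine();  // keep the subscription alive
        rosSocket.Close();
    }
}
```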
In general, the human-robot collaborative assembly task requires a method to represent the digital human model, which is used to map the tracking system onto joint variables. In this context, we created a human avatar using the MakeHuman package [15]. The digital human model was configured according to the worker's size before it was exported as an FBX file. The model contains 53 joints; however, only 9 HTC Vive trackers were attached to the human body, and the remaining body joints were neglected. The trackers were fixed to the body using straps of different sizes (see Figure 1).
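A tracker-to-avatar mapping of this kind could look like the following Unity C# sketch. The bone assignments and the T-pose calibration step are our assumptions for illustration; the actual mapping used in this work is not published in this form.

```csharp
// Illustrative Unity sketch (hypothetical field names): drive 9 of the 53
// avatar joints from the HTC Vive tracker poses delivered by SteamVR.
using UnityEngine;

public class TrackerToAvatarMapper : MonoBehaviour
{
    // Assigned in the inspector: tracked SteamVR objects and the avatar bones
    // they control (e.g., pelvis, upper arms, wrists, knees, feet, head).
    public Transform[] trackers = new Transform[9];
    public Transform[] avatarBones = new Transform[9];

    // Rotation offsets captured once in a T-pose, so that strap placement on
    // the body does not distort the skeleton.
    private Quaternion[] offsets = new Quaternion[9];

    void Start()
    {
        for (int i = 0; i < trackers.Length; i++)
            offsets[i] = Quaternion.Inverse(trackers[i].rotation) * avatarBones[i].rotation;
    }

    void LateUpdate()
    {
        // Copy each tracker pose onto its bone; the remaining joints keep the
        // rig's default pose, as described above.
        for (int i = 0; i < trackers.Length; i++)
            avatarBones[i].SetPositionAndRotation(
                trackers[i].position, trackers[i].rotation * offsets[i]);
    }
}
```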
At the same time, the geometric model of the collaborative robot, in this case a Universal Robots arm, was imported into the Unity3D environment. The unified robot description format (URDF) importer in ROS# helped to configure the digital robot model in Unity3D easily. Finally, by launching the file server, we enabled ROS-I communication through WebSockets using a single computer running a Linux operating system (see Figure 2).
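Once the URDF model is present in the scene, each revolute joint of the digital robot can be driven by the joint values received over the bridge. The following Unity C# sketch shows one simple way to do this; it is an assumption for illustration, not the internals of the ROS# importer.

```csharp
// Sketch (illustrative assumption): rotate an imported URDF link about its
// joint axis according to the received joint position in radians.
using UnityEngine;

public class RevoluteJointController : MonoBehaviour
{
    public Vector3 jointAxis = Vector3.forward;  // axis from the URDF <axis> tag
    private Quaternion zeroRotation;             // pose at joint value 0

    void Awake()
    {
        zeroRotation = transform.localRotation;
    }

    // Called by the joint-state subscriber for every new sample.
    public void SetJointPosition(float radians)
    {
        transform.localRotation =
            zeroRotation * Quaternion.AngleAxis(radians * Mathf.Rad2Deg, jointAxis);
    }
}
```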
In general, the tracking system allows trajectory planning for the robot motion, and the robot then follows the defined trajectory. The assembly task is a sequence of motions such as pick—reach—join—apply force—move. The concept is that the human worker pushes the second object onto the first object, which is gripped by the robot gripper, and then applies pressure to fit both objects together. The concept of the experimental design is shown in Figure 3. The human operator was tracked by the HTC Vive trackers, and the robot was tracked through the ROS-I system.
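The robot-side sequencing with a wait step can be sketched as a Unity coroutine, as below; the helper names and the fixed trajectory duration are hypothetical placeholders, since the paper does not publish the task logic.

```csharp
// Hypothetical sketch of the reach / wait / move sequencing on the robot side.
using System.Collections;
using UnityEngine;

public class AssemblySequence : MonoBehaviour
{
    // Set by the tracking system once the human join/apply-force step is done.
    public bool humanJoinCompleted;

    IEnumerator Start()
    {
        yield return MoveRobotTo("pre-assembly position");      // robot: reach
        yield return new WaitUntil(() => humanJoinCompleted);   // robot: wait
        yield return MoveRobotTo("handling area");              // robot: move
    }

    IEnumerator MoveRobotTo(string target)
    {
        Debug.Log($"Commanding robot trajectory to {target}");
        yield return new WaitForSeconds(2f);  // placeholder for trajectory execution
    }
}
```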

3. Results

Human-robot motion collaboration using a low-cost tracking system was applied to derive a concept of how a human worker behaves during an assembly process. The motion was parameterized into different levels for both the human and the robot. For the human, the task was defined as a sequence of motions, i.e., reach—join—apply force—release. The robot remained at its position during the assembly process based on a wait function; once the assembly step was performed, the robot started to move to the target position. Therefore, the task execution was monitored in time-space (see Table 1).
The overall motion phenomenon is demonstrated in Figure 4, which shows the motion of the right-hand wrist and of the robot tool center position. This motion was captured during real-time execution.
In this work, the motion was captured for both human joints and robot joints. The results were visualized in two-dimensional space to describe how the X- and Y-components behaved. Figure 5a shows the projection of the XY-components, in which the arm moves towards the robot tool center position. Similarly, the distribution of the X-components is shown in Figure 5b. Labels A and B mark the motion toward the robot tool center position and the position of the unconstrained object, respectively.
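As an illustration only, a distribution like the one in Figure 5b can be produced by binning the recorded X-components into a histogram; the sketch below uses synthetic placeholder samples, not the measured wrist path.

```csharp
// Histogram of wrist X-components over 50 motion steps (synthetic data).
using System;
using System.Linq;

class XComponentHistogram
{
    static void Main()
    {
        // Placeholder for the 50 recorded wrist X positions in metres.
        double[] xSamples = Enumerable.Range(0, 50)
            .Select(i => 0.2 + 0.3 * Math.Sin(i * 0.12)).ToArray();

        const int bins = 10;
        double min = xSamples.Min(), max = xSamples.Max();
        double width = (max - min) / bins;

        int[] counts = new int[bins];
        foreach (double x in xSamples)
            counts[Math.Min(bins - 1, (int)((x - min) / width))]++;

        for (int b = 0; b < bins; b++)
            Console.WriteLine($"[{min + b * width:F3}, {min + (b + 1) * width:F3}): {counts[b]}");
    }
}
```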
This work is part of ongoing research; further analysis regarding time-space and positioning accuracy will therefore be considered in future work.

4. Discussion

In this work, we presented a concept of real-time motion tracking for assembly processes in human-robot collaborative environments using a configuration of different systems, namely ROS-I and Unity3D. The key achievement of this work is a generic model controller for human and robot motion tracking using a kinematic tree structure. In the same manner, consistent motion capturing formats and exchanges were investigated. Similarly, the capability of low-cost, commercially available gaming equipment such as the HTC Vive was tested. In the overall process, a single operating system (Linux, Ubuntu 18.04) was used to run both the ROS-I and Unity3D systems. In this configuration, the assembly process was described in terms of reach—join—apply force—release—move motion types, which were defined to be executed by the human and the robot. Based on the preliminary evaluation, a wait function was defined to classify the tasks and their descriptions. This approach does not optimize the assembly process or the cycle time. Likewise, real-time robot control was not implemented in order to avoid the probability of a collision if the human operator moves into the robot path before the robot reaches the assembly position. In this regard, this work serves as an initial step towards addressing safety parameters.
From the results of this work, it can be concluded that real-time tracking in human-robot collaborative assembly environments can be considered to maximize human worker safety. In this regard, the reliability and robustness of the system with respect to disturbances need to be considered further.

Acknowledgments

The authors would like to acknowledge the financial support by the Federal Ministry of Education and Research of Germany within the ITEA3 project MOSIM (grant number: 01IS18060AH).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. You, B.; Lou, Z.; Luo, Y.; Xu, Y.; Wang, X. Prediction of Pressing Quality for Press-Fit Assembly Based on Press-Fit Curve and Maximum Press-Mounting Force. Int. J. Aerosp. Eng. 2015, 2015.
  2. Darvish, K.; Tirupachuri, Y.; Romualdi, G.; Rapetti, L.; Ferigo, D.; Chavez, F.J.A.; Pucci, D. Whole-Body Geometric Retargeting for Humanoid Robots. arXiv 2019, arXiv:1909.10080.
  3. Tuli, T.B.; Manns, M. Hierarchical motion control for real time simulation of industrial robots. Procedia CIRP 2019, 81, 713–718.
  4. Caputo, F.; Greco, A.; D'Amato, E.; Notaro, I.; Spada, S. IMU-Based Motion Capture Wearable System for Ergonomic Assessment in Industrial Environment. In Proceedings of the Advances in Human Factors in Wearable Technologies and Game Design, Orlando, FL, USA, 21–25 July 2018.
  5. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion. Sensors 2017, 17, 1257.
  6. Elgendi, M.; Picon, F.; Magnenat-Thalmann, N.; Abbott, D. Arm movement speed assessment via a Kinect camera: A preliminary study in healthy subjects. Biomed. Eng. Online 2014, 13, 88.
  7. Aurand, A.M.; Dufour, J.S.; Marras, W.S. Accuracy map of an optical motion capture system with 42 or 21 cameras in a large measurement volume. J. Biomech. 2017, 58, 237–240.
  8. Ramirez-Alpizar, I.G.; Harada, K.; Yoshida, E. Human-based framework for the assembly of elastic objects by a dual-arm robot. Robomech. J. 2017, 4, 20.
  9. Koubaa, A. (Ed.) Robot Operating System (ROS): The Complete Reference (Volume 2); Studies in Computational Intelligence; Springer International Publishing: Cham, Switzerland, 2017.
  10. Sita, E.; Horváth, C.M.; Thomessen, T.; Korondi, P.; Pipe, A.G. ROS-Unity3D based system for monitoring of an industrial robotic process. In Proceedings of the 2017 IEEE/SICE International Symposium on System Integration (SII), Taipei, Taiwan, 11–14 December 2017; pp. 1047–1052.
  11. Bartneck, C.; Soucy, M.; Fleuret, K.; Sandoval, E.B. The robot engine—Making the unity 3D game engine work for HRI. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; pp. 431–437.
  12. ROS#. Available online: https://github.com/siemens/ros-sharp (accessed on 9 October 2019).
  13. Glogowski, P.; Lemmerz, K.; Hypki, A.; Kuhlenkötter, B. ROS-Based Robot Simulation in Human-Robot Collaboration. In Developing Support Technologies: Integrating Multiple Perspectives to Create Assistance That People Really Want; Karafillidis, A., Weidner, R., Eds.; Biosystems & Biorobotics; Springer International Publishing: Cham, Switzerland, 2018; pp. 237–246. ISBN 978-3-030-01836-8.
  14. Roldán, J.J.; Peña-Tapia, E.; Garzón-Ramos, D.; de León, J.; Garzón, M.; del Cerro, J.; Barrientos, A. Multi-robot Systems, Virtual Reality and ROS: Developing a New Generation of Operator Interfaces. In Robot Operating System (ROS): The Complete Reference (Volume 3); Koubaa, A., Ed.; Studies in Computational Intelligence; Springer International Publishing: Cham, Switzerland, 2019; pp. 29–64. ISBN 978-3-319-91590-6.
  15. MakeHuman Community. Available online: http://www.makehumancommunity.org/ (accessed on 9 October 2019).
Figure 1. Tracker assignment and orientation alignment for human motion tracking using HTC Vive trackers (9 trackers are used): (a) Trackers fixed on human body parts; (b) a digital human model; and (c) a skeleton control and joint locations.
Figure 2. System configuration for robot operating system industrial (ROS-I) and Unity3D based assembly process planning and control (The dashed lines show an alternative configuration).
Figure 3. The concept for virtual simulation and real-time motion tracking for collaborative human-robot assembly tasks; (a) Experimental design for assembly tasks using human-robot collaboration; (b) Unity3D environment showing system configuration and practical realization.
Figure 4. Motion parameterization for human-robot collaboration; (a) Wait for the robot until it reaches the defined position; (b) The human arm extended to reach the robot tool center position; (c) The robot moves the object to the target position using a square path. The red color shows the robot path and the green lines show the wrist path.
Figure 5. Wrist path for 50 steps. (a) Projection of XY-components for human arm motion; (b) Distribution of the X-component.
Table 1. Motion assignment and task descriptions.
| Motion Type | Human | Robot |
| --- | --- | --- |
| Reach | The human arm reaches to the robot tool center position. | The robot moves from the home position to the desired position. |
| Join | The human arm moves part two to align its orientation with part one. | The robot waits for the joining process. |
| Apply force | A force is applied to assemble the parts by force-fitting. | The robot waits for the joining process. |
| Release | The human hand releases the part. | The robot waits for the process. |
| Move | The human arm retracts back to the object position. | The robot moves the assembled part to the handling area. |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
