Motion Similarity Evaluation between Human and a Tri-Co Robot during Real-Time Imitation with a Trajectory Dynamic Time Warping Model

Liang Gong et al. Sensors (Basel). 2022 Mar 2;22(5):1968. doi: 10.3390/s22051968.

Abstract

Precisely imitating human motions in real time poses a challenge for robots because of the differences between human and robot physical structures. This paper proposes a human-computer interaction method for remotely manipulating a life-size humanoid robot, together with a new metric for evaluating motion similarity. First, we establish a motion capture system to acquire the operator's motion data and retarget it to a standard bone model. Second, we develop a fast mapping algorithm that maps the BVH (BioVision Hierarchy) data collected by the motion capture system to each joint motion angle of the robot, realizing imitated motion control of the humanoid robot. Third, a DTW (Dynamic Time Warping)-based trajectory evaluation method is proposed to quantitatively evaluate the difference between the robot trajectory and the human motion; visualization terminals also make it convenient to compare the two different but simultaneous motion systems. We design a complex gesture simulation experiment to verify the feasibility and real-time performance of the control method. The proposed human-in-the-loop imitation control method addresses the prominent non-isostructural retargeting problem between human and robot, enhances the robot's interaction capability in a more natural way, and improves its adaptability to uncertain and dynamic environments.
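The DTW-based trajectory comparison described above can be sketched with the classic dynamic-programming recurrence. This is a minimal illustration, not the paper's implementation: the function name and the synthetic joint-angle trajectories are assumptions for the example.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW distance between two 1-D joint-angle trajectories.

    D[i, j] holds the minimum accumulated cost of aligning a[:i] with
    b[:j]; each cell extends the cheapest of the three predecessor
    alignments (match, insertion, deletion).
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# The same motion sampled at different rates warps to a small distance,
# which is why DTW suits comparing human and robot trajectories that
# are simultaneous but not frame-synchronized.
human = np.sin(np.linspace(0, np.pi, 50))
robot = np.sin(np.linspace(0, np.pi, 80))
print(dtw_distance(human, robot))
```

Because the warping path absorbs timing differences, the resulting distance reflects shape similarity of the trajectories rather than sampling-rate mismatch.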

Keywords: BioVision hierarchy; DTW-based trajectory evaluation; human-in-the-loop control; life-size humanoid robot; motion capture; motion imitation.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. DOF of the humanoid robot (DOF of the fingers are not shown).
Figure 2. Overall structure of the proposed method.
Figure 3. Visualized data stream through ROS publish/subscribe messaging.
Figure 4. Designed communication protocol.
Figure 5. Three motion systems with different constraints: (a) the human motion system with biological constraints; (b) the BVH motion system with no constraints; (c) the robot motion system with mechanical constraints.
Figure 6. Elbow conversion from 2 DOF to 1 DOF. x1y1z1 and x2y2z2 are the DH coordinate systems attached to the two links of the elbow joint, respectively; Ω is the bending angle of the elbow joint.
Figure 7. Wrist conversion from 2 DOF to 1 DOF. x1y1z1 and x2y2z2 are the DH coordinate systems attached to the two links of the wrist joint, respectively; ω is the rotating angle of the wrist joint.
Figure 8. Elbow joint angle map.
Figure 9. Wrist joint angle map.
Figure 10. Schematic diagram of three human trajectories mapped to the same robot trajectory.
Figure 11. Different visualization terminals for the different motion systems.
Figure 12. Experiments on different gestures with the arms and head.
Figure 13. Comparison between fingers.
Figure 14. Snapshots of the motion trajectory.
Figure 15. DTW distance in the experiment.
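The 2-DOF-to-1-DOF elbow conversion of Figure 6 reduces to computing a single bending angle Ω between the two links. A minimal sketch, assuming unit-free bone direction vectors are available from the retargeted BVH skeleton (the function name and sample vectors are illustrative):

```python
import numpy as np

def elbow_bend_angle(upper_arm: np.ndarray, forearm: np.ndarray) -> float:
    """Collapse the elbow's rotation to the single bending angle Ω:
    the angle between the upper-arm and forearm direction vectors."""
    u = upper_arm / np.linalg.norm(upper_arm)
    f = forearm / np.linalg.norm(forearm)
    # Clip guards against floating-point values just outside [-1, 1].
    return float(np.arccos(np.clip(np.dot(u, f), -1.0, 1.0)))

# A forearm bent 45 degrees away from a straight-down upper arm.
angle = elbow_bend_angle(np.array([0., -1., 0.]), np.array([1., -1., 0.]))
print(np.degrees(angle))  # ≈ 45.0
```

The same dot-product construction applies to the wrist conversion of Figure 7, with the rotating angle ω taken about the forearm axis instead.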

