Learning Two-Person Interaction Models for Responsive Synthetic Humanoids

Authors

D. Vogt, H. Ben Amor, E. Berger, B. Jung

DOI:

https://doi.org/10.20385/1860-2037/11.2014.1

Keywords:

humanoid robots, imitation learning, interaction learning, motion adaptation, motor learning, virtual characters

Abstract

Imitation learning is a promising approach for generating life-like behaviors of virtual humans and humanoid robots. So far, however, imitation learning has been mostly restricted to single-agent settings, where observed motions are adapted to new environmental conditions but not to the dynamic behavior of interaction partners. In this paper, we introduce a new imitation learning approach that is based on the simultaneous motion capture of two human interaction partners. From the observed interactions, low-dimensional motion models are extracted and a mapping between these motion models is learned. This interaction model allows the real-time generation of agent behaviors that are responsive to the body movements of an interaction partner. The interaction model can be applied both to the animation of virtual characters and to the generation of behavior for humanoid robots.
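The pipeline the abstract describes (synchronized capture of two partners, low-dimensional motion models, a learned mapping between them, real-time response generation) can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual method: PCA stands in for the dimensionality reduction, ridge regression for the learned mapping, and all data and names here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for synchronized mocap: T frames x D joint angles per partner.
T, D, K = 500, 30, 4                       # frames, joint dims, latent dims
t = np.linspace(0, 2 * np.pi, T)[:, None]
A = np.sin(t * np.arange(1, D + 1))        # partner A (the observed human)
B = np.cos(t * np.arange(1, D + 1)) * 0.5  # partner B (the agent to synthesize)

def pca_fit(X, k):
    """Low-dimensional motion model: mean plus top-k principal directions."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

muA, WA = pca_fit(A, K)
muB, WB = pca_fit(B, K)
zA = (A - muA) @ WA.T                      # latent trajectory of partner A
zB = (B - muB) @ WB.T                      # latent trajectory of partner B

# Interaction model: ridge regression from A's latent state to B's latent state.
lam = 1e-3
M = np.linalg.solve(zA.T @ zA + lam * np.eye(K), zA.T @ zB)

def respond(frame_A):
    """Map one observed frame of partner A to a synthesized frame of partner B."""
    z = (frame_A - muA) @ WA.T             # project observation into A's model
    return muB + (z @ M) @ WB              # map to B's latent space, reconstruct

# Per-frame generation, as would run in a real-time loop.
B_hat = np.vstack([respond(a) for a in A])
err = np.abs(B_hat - B).mean()
print(f"mean reconstruction error: {err:.4f}")
```

Because `respond` needs only one matrix projection, one small latent-space multiplication, and one reconstruction per frame, such a model is cheap enough for the real-time responsiveness the abstract claims.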

Published

2014-01-31

How to Cite

Vogt, D., Ben Amor, H., Berger, E., & Jung, B. (2014). Learning Two-Person Interaction Models for Responsive Synthetic Humanoids. Journal of Virtual Reality and Broadcasting, 11. https://doi.org/10.20385/1860-2037/11.2014.1

Section

GI VR/AR 2012