Real-time Retargeting of Deictic Motion to Virtual Avatars for Augmented Reality Telepresence

2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Jiho Kang, Dongseok Yang, Taehei Kim, Yewon Lee, and Sung-Hee Lee

Abstract

Avatar-mediated augmented reality telepresence aims to enable distant users to collaborate remotely through avatars. When the two spaces involved in telepresence are dissimilar, with different object sizes and arrangements, the avatar's movement must be adjusted to convey the user's intention rather than directly following their motion, which poses a significant challenge. In this paper, we propose a novel neural network-based framework for real-time retargeting of users' deictic motions (pointing at and touching objects) to virtual avatars in dissimilar environments. Our framework translates the user's deictic motion, acquired from a sparse set of tracking signals, into the virtual avatar's deictic motion toward a corresponding remote object in real time. A key feature of our framework is that a single trained network can generate natural deictic motions for users of various sizes. To this end, our network comprises two sub-networks: AngleNet and MotionNet. AngleNet maps the angular state of the user's motion into a latent representation, which MotionNet then converts into the avatar's pose, taking the user's scale into account. We validate the effectiveness of our method in terms of deictic intention preservation and movement naturalness through quantitative comparison with alternative approaches. Additionally, we demonstrate the utility of our approach through several AR telepresence scenarios.
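The abstract describes a two-stage pipeline: AngleNet encodes the user's angular state into a latent representation, and MotionNet decodes that latent code into an avatar pose conditioned on the user's scale. The paper's actual architecture, layer types, and dimensions are not given here, so the following is a purely illustrative sketch (all class names mirror the paper's sub-network names, but every dimension, weight, and layer choice is a placeholder assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

class AngleNet:
    """Illustrative stand-in: maps the angular state of the user's
    deictic motion to a latent representation (single linear layer
    with tanh here; the real network is a trained neural model)."""
    def __init__(self, in_dim, latent_dim):
        self.W = rng.standard_normal((in_dim, latent_dim)) * 0.1

    def __call__(self, angles):
        return np.tanh(angles @ self.W)

class MotionNet:
    """Illustrative stand-in: decodes the latent code into the avatar's
    pose, conditioned on the user's scale so one trained network can
    serve users of various sizes (scale is appended as an extra input)."""
    def __init__(self, latent_dim, pose_dim):
        self.W = rng.standard_normal((latent_dim + 1, pose_dim)) * 0.1

    def __call__(self, z, user_scale):
        x = np.concatenate([z, [user_scale]])
        return x @ self.W

# Placeholder dimensions: 6 tracked angles in, 16-D latent, 24-D pose out.
angle_net = AngleNet(in_dim=6, latent_dim=16)
motion_net = MotionNet(latent_dim=16, pose_dim=24)

user_angles = rng.standard_normal(6)          # angular state from sparse tracking
z = angle_net(user_angles)                    # latent deictic representation
avatar_pose = motion_net(z, user_scale=1.1)   # avatar pose for the remote object
print(avatar_pose.shape)
```

The scale-conditioning in `MotionNet` reflects the abstract's claim that a single trained network handles multiple user sizes; in this sketch that simply means the scale is a network input rather than a reason to retrain.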

Download

[PAPER]

Demo video

[VIDEO]