
How can we understand a user’s intentions in one space and intelligently transmit movements with the same meaning to an avatar placed in a different space? Telepresence Avatar Motion research aims to enable avatars to reproduce users’ behavioral characteristics as they interact with spaces, objects, and other people in real time. This approach can be applied to virtual meetings, remote collaboration, and immersive communication systems where natural expression is essential.
Our current research focuses on understanding and transmitting users’ upper-body motion intentions using neural network-based learning. We are developing technologies that accurately analyze and reproduce gestures expressed through head and hand movements. In the future, we plan to extend this work to full-body motion, enabling complete physical expression, including walking patterns, posture changes, and leg movements. By progressively expanding our deep learning methods from upper-body to full-body motion, we aim to build an intelligent virtual communication system in which avatars convey users’ intentions naturally and intuitively.
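To make the idea concrete, the sketch below shows one minimal way such a gesture-to-avatar mapping could be set up: a small network that regresses avatar upper-body joint rotations from tracked head and hand poses. This is an illustrative assumption, not the lab’s actual architecture; the tracker layout, joint count, and 6D rotation output are all hypothetical choices.

```python
# Minimal sketch (not the published method): map tracked user poses to
# avatar upper-body joint rotations. Dimensions are illustrative.
import torch
import torch.nn as nn


class GestureRetargetNet(nn.Module):
    """Regresses avatar joint rotations from head + both-hand tracker poses."""

    def __init__(self, num_avatar_joints: int = 12):
        super().__init__()
        # 3 trackers (head, left hand, right hand), each with a
        # 3D position and a 6D rotation representation.
        in_dim = 3 * (3 + 6)
        # One 6D rotation per avatar upper-body joint (a continuous
        # representation that is easier to regress than quaternions).
        out_dim = num_avatar_joints * 6
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, out_dim),
        )

    def forward(self, tracker_poses: torch.Tensor) -> torch.Tensor:
        # tracker_poses: (batch, 27) -> (batch, num_joints, 6)
        out = self.net(tracker_poses)
        return out.view(-1, out.shape[-1] // 6, 6)


model = GestureRetargetNet()
fake_trackers = torch.randn(1, 27)      # one frame of head/hand tracking
joint_rotations = model(fake_trackers)  # (1, 12, 6) avatar joint targets
print(joint_rotations.shape)
```

In practice, retargeting between dissimilar spaces also requires remapping the gesture’s target (for example, an object being pointed at that sits elsewhere in the remote room), which is the semantic translation problem studied in the papers below.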
Recent papers
- Real-time Translation of Upper-body Gestures to Virtual Avatars in Dissimilar Telepresence Environments, J. Kang, T. Kim, H. Kim, and S-H Lee, IEEE Transactions on Visualization and Computer Graphics (TVCG), Jun. 2025 [project]
- Evaluating User Perception toward Physics-adapted Avatar in Remote Heterogeneous Spaces, T. Kim, H. Kim, J. Lee, and S-H Lee, Computers & Graphics, Feb. 2025 [paper]
- Visual Guidance for User Placement in Avatar-Mediated Telepresence between Dissimilar Spaces, D. Yang, J. Kang, T. Kim, and S-H Lee, IEEE Transactions on Visualization and Computer Graphics (TVCG), Jan. 2024 [paper][demo]
- Real-time Retargeting of Deictic Motion to Virtual Avatars for Augmented Reality Telepresence, J. Kang, D. Yang, T. Kim, Y. Lee, and S-H Lee, IEEE ISMAR 2023, Oct. 2023 [project]
- A Mixed Reality Telepresence System for Dissimilar Spaces Using Full-Body Avatar, L. Yoon, D. Yang, C. Chung, and S-H Lee, Proceedings of ACM SIGGRAPH Asia 2020 XR, Dec. 2020 [abstract][paper][demo]
- Placement Retargeting of Virtual Avatars to Dissimilar Indoor Environments, L. Yoon, D. Yang, J. Kim, C. Chung, and S-H Lee, IEEE Transactions on Visualization and Computer Graphics (TVCG), Aug. 2020 (presented at IEEE ISMAR 2020) [project]
- Effects of Locomotion Style and Body Visibility of a Telepresence Avatar, Y. Choi, J. Lee, and S-H Lee, IEEE VR, Mar. 2020 [project]
- Multi-Finger Interaction between Remote Users in Avatar-Mediated Telepresence, Y. Lee, S. Lee, and S-H Lee, CASA 2017, in Computer Animation and Virtual Worlds, 28(3-4), May 2017 [data][paper]
- Retargeting Human-Object Interaction to Virtual Avatars, Y. Kim, H. Park, S. Bang, and S-H Lee, ISMAR 2016, in IEEE Transactions on Visualization and Computer Graphics, 22(11), pp. 2405-2412, Nov. 2016 [project]