Research

Telepresence Avatar 

We investigate AR telepresence avatar technology that allows geographically separated people to meet and communicate through their avatars. People live in vastly diverse indoor spaces with varying sizes and furniture configurations, so an avatar sent to a different space must adapt its movement to preserve the meaning of the actions the user performs in the original space. This need raises various research problems: How should the avatar’s position and orientation be determined to enable natural communication between the users? How should the avatar move to allow contact interaction, such as a handshake, between the avatar and the user? If a user interacts with an object (e.g., a chair or desk), how should their avatar choose a suitable corresponding object in the other space and move to use it? How can we enable people in different places to meet up and hold a meeting or a party? Some of these research questions are also important in metaverse and VR environments. To solve them, we analyze and model the behavioral characteristics of humans interacting with the space, objects, and other people, and develop computational methods that make the avatar reproduce these characteristics. A toy version of the placement question is sketched below.
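
To make the placement question concrete, it can be framed as scoring candidate avatar poses in the local room and picking the best one. The Python sketch below is a minimal, hypothetical formulation, not our published retargeting method: the 1.2 m preferred distance, the cost weights, and the candidate list are illustrative assumptions.

    import math

    def wrap_angle(a):
        # Wrap an angle to [-pi, pi] so orientation costs compare correctly.
        return (a + math.pi) % (2 * math.pi) - math.pi

    def placement_score(cand, user, w_dist=1.0, w_face=1.0):
        # cand, user: (x, y, facing) poses in the local room (meters, radians).
        # Hypothetical criteria: keep a comfortable conversational distance
        # and face the local user. A real system would also account for
        # obstacles, furniture semantics, and the remote user's pose.
        dx, dy = user[0] - cand[0], user[1] - cand[1]
        dist_cost = (math.hypot(dx, dy) - 1.2) ** 2          # prefer ~1.2 m
        face_cost = wrap_angle(math.atan2(dy, dx) - cand[2]) ** 2
        return -(w_dist * dist_cost + w_face * face_cost)

    def best_placement(candidates, user_pose):
        return max(candidates, key=lambda c: placement_score(c, user_pose))

    # Example: choose among three candidate poses for a user at the origin.
    candidates = [(1.0, 0.5, math.pi), (3.0, 0.0, 0.0), (0.0, 1.2, -math.pi / 2)]
    print(best_placement(candidates, (0.0, 0.0, 0.0)))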

Recent Papers

  • A Mixed Reality Telepresence System for Dissimilar Spaces Using Full-Body Avatar, L. Yoon, D. Yang, C. Chung, and S-H Lee, Proceedings of ACM SIGGRAPH Asia 2020 XR, Dec. 2020 [abstract][paper][demo]
  • Placement Retargeting of Virtual Avatars to Dissimilar Indoor Environments, L. Yoon, D. Yang, J. Kim, C. Chung, and S-H Lee, IEEE Transactions on Visualization and Computer Graphics (TVCG), Aug. 2020 (presented at IEEE ISMAR 2020) [project]
  • Effects of Locomotion Style and Body Visibility of a Telepresence Avatar, Y. Choi, J. Lee, and S-H Lee, IEEE VR, Mar. 2020 [project]
  • Multi-Finger Interaction between Remote Users in Avatar-Mediated Telepresence, Y. Lee, S. Lee, and S-H Lee, CASA 2017, in Computer Animation and Virtual Worlds, 28(3-4), May 2017 [data] [paper]
  • Retargeting Human-Object Interaction to Virtual Avatars, Y. Kim, H. Park, S. Bang, and S-H Lee, ISMAR 2016, in IEEE Transactions on Visualization and Computer Graphics, 22(11), pp. 2405-2412, Nov. 2016 [project]
  • Hand Contact between Remote Users through Virtual Avatars, J. Oh, Y. Lee, Y. Kim, S. Lee, and S-H Lee, CASA 2016, pp. 1-4, May 2016 [paper]


Avatar Modeling

Generating realistic human models and simulating them have long been key topics in computer graphics research. The KAIST Motion Computing Laboratory researches how to generate realistic avatar models that resemble their users. To this end, we have developed a method that estimates parameters of a deformable tissue model of a human (e.g., muscle and flesh) to replicate a person’s deformation characteristics during dynamic motions. We have also developed a method that recognizes the user’s hairstyle and applies a suitable simulation technique for efficient and realistic hair simulation. Currently, we are developing deep learning approaches that create 3D garment models instantly and simulate them in real time. We aim to develop methods that generate a hyper-realistic avatar reproducing both the user’s appearance and their distinctive way of moving. These avatar models are typically posed with standard skinning techniques, as sketched below.
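
As background for the skinning-weight work listed below, standard linear blend skinning (LBS) deforms each vertex as a weighted sum of bone transforms, v_i' = sum_j w_ij T_j v_i. The NumPy sketch below illustrates the textbook LBS formulation, not our spline-based editing interface; the toy vertices and weights are made up for the sanity check.

    import numpy as np

    def linear_blend_skinning(rest_verts, weights, bone_transforms):
        # rest_verts:      (N, 3) rest-pose vertex positions
        # weights:         (N, J) per-vertex skinning weights; rows sum to 1
        # bone_transforms: (J, 4, 4) homogeneous transforms (rest -> posed)
        n = rest_verts.shape[0]
        v_h = np.hstack([rest_verts, np.ones((n, 1))])           # homogeneous
        posed = np.einsum('jab,nb->nja', bone_transforms, v_h)   # per-bone v'
        blended = np.einsum('nj,nja->na', weights, posed)        # weighted sum
        return blended[:, :3]

    # Sanity check: identity transforms must reproduce the rest pose.
    verts = np.array([[0.0, 1.0, 0.0], [0.0, 2.0, 0.0]])
    w = np.array([[1.0, 0.0], [0.5, 0.5]])
    T = np.stack([np.eye(4), np.eye(4)])
    assert np.allclose(linear_blend_skinning(verts, w, T), verts)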

Recent Papers

  • Estimating Garment Patterns from Static Scan Data, S. Bang, M. Korosteleva, and S-H Lee, Computer Graphics Forum (accepted), May 2021 [project]
  • Spline Interface for Intuitive Skinning Weight Editing, S. Bang and S-H Lee, ACM Transactions on Graphics (TOG), 37(5), 2018  [project]
  • Hair Modeling and Simulation by Style, S. Jung and S-H Lee, Eurographics 2018 Conference, in Computer Graphics Forum, 37(2), 355-363, May 2018 [project]
  • Regression-Based Landmark Detection on Dynamic Human Models, D-K Jang and S-H Lee, Pacific Graphics 2017 Conference, in Computer Graphics Forum, 36(7), 73-82, Oct. 2017 [project]
  • Data-Driven Physics for Human Soft Tissue Animation, M. Kim, G. Pons-Moll, S. Pujades, S. Bang, J. Kim, M. J. Black, and S-H Lee, ACM Transactions on Graphics (Proc. SIGGRAPH), 36(4), 54:1-12, 2017 [project]


Generating Human Motions

Generating natural movement for virtual characters and avatars is a crucial technology for computer games, animation, VR/AR, and intelligent agents. Conventionally, character motions have been created with labor- and time-intensive methods such as keyframing. To enhance the quality and efficiency of motion generation, researchers have developed various advanced techniques based on machine learning, physics simulation, control theory, and biomechanics. Our laboratory has developed novel methods to improve the quality of motion capture data, to create balance-keeping behaviors for characters, and to enable locomotion through complex environments using multiple body parts in contact. Currently, we are researching how to enable characters to move naturally in complex, real-world-like environments with various types of furniture and objects. Another topic we focus on is motion stylization, which controls how a movement is performed to reflect contexts such as emotion, personality, and situation. A related subproblem, predicting the lower body from sparse tracking signals, is sketched below.
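
As a concrete example of this line of work, the lower-body prediction task in LoBSTr (listed below) can be phrased as sequence regression: given a window of sparse upper-body tracker signals, predict the current lower-body pose. The PyTorch sketch below is a deliberately simplified stand-in; the dimensions, the single GRU layer, and the output parameterization are illustrative assumptions, not the published architecture.

    import torch
    import torch.nn as nn

    class LowerBodyPredictor(nn.Module):
        # Maps a window of sparse upper-body tracking signals (e.g., head
        # and hand 6-DoF poses per frame) to a lower-body pose vector.
        def __init__(self, tracker_dim=36, hidden_dim=256, lower_pose_dim=48):
            super().__init__()
            self.gru = nn.GRU(tracker_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, lower_pose_dim)

        def forward(self, signals):            # (batch, time, tracker_dim)
            features, _ = self.gru(signals)
            return self.head(features[:, -1])  # pose at the last frame

    # Example: a single 60-frame tracking window.
    model = LowerBodyPredictor()
    window = torch.randn(1, 60, 36)
    print(model(window).shape)  # torch.Size([1, 48])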

Recent Papers

  • LoBSTr: Real-time Lower-body Pose Prediction from Sparse Upper-body Tracking Signals, D. Yang, D. Kim, and S-H Lee, Eurographics 2021 Conference, in Computer Graphics Forum, May 2021 [project]
  • Constructing Human Motion Manifold with Sequential Networks, D-K Jang and S-H Lee, Computer Graphics Forum, May 2020 [project]
  • Projective Motion Correction with Contact Optimization, S. Lee and S-H Lee, IEEE Transactions on Visualization and Computer Graphics (TVCG), Mar. 2018 [project]
  • Aura Mesh: Motion Retargeting to Preserve the Spatial Relationships between Skinned Characters, T. Jin, M. Kim, and S-H Lee, Eurographics 2018 Conference, in Computer Graphics Forum, 37(2), 311-320, May 2018 [project]
  • Multi-Contact Locomotion Using a Contact Graph with Feasibility Predictors, C. Kang and S-H Lee, ACM Transactions on Graphics, 36(2), 22:1-14, Apr. 2017 (presented at SIGGRAPH 2017) [project]
  • Realistic Biomechanical Simulation and Control of Human Swimming, W. Si, S-H Lee, E. Sifakis, and D. Terzopoulos, ACM Transactions on Graphics, 34(1), 10:1-15, Nov. 2014 (presented at ACM SIGGRAPH 2015) [project]


Video Introduction to KAIST LAVA (previously Motion Computing) Lab. (Korean)