ExBody2:
Advanced Expressive Humanoid Whole-Body Control

Mazeyu Ji*             Xuanbin Peng*             Fangchen Liu             Jialong Li            
Ge Yang             Xuxin Cheng             Xiaolong Wang



Expressive Whole-body Mimic

Demo videos: Side Stepping · Waltz Dance · Walking Indoor · Walking Outdoor · Pacing Around Outdoor · Pacing Around Indoor · Dancing - Waving Arms · Dancing - Drawing Circles · Crouching · Punching · Body Hook Right · Squat, Stand, and Wave Hands

Whole-body Tracking Performance

Real-time Whole-body Mimic

Real-time pose estimation is provided by HybrIK.

Demo video: Holding Box
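As a rough sketch of how such a real-time mimic loop can be wired together, the snippet below streams estimated human poses into the whole-body tracking policy at a fixed control rate. Every function and object name here is a hypothetical placeholder, not HybrIK's or ExBody2's actual API.

# Hypothetical real-time mimic loop: camera -> pose estimator -> tracking policy.
# None of these names come from HybrIK or the ExBody2 codebase; they are placeholders.
import time

def run_realtime_mimic(camera, pose_estimator, policy, robot, hz=30):
    """Stream human poses to the whole-body tracking policy at a fixed rate."""
    dt = 1.0 / hz
    while True:
        t0 = time.time()
        frame = camera.read()                 # RGB image of the human demonstrator
        keypoints = pose_estimator(frame)     # 3D body keypoints (e.g., from HybrIK)
        obs = robot.get_proprioception()      # joint positions/velocities, IMU, ...
        action = policy(obs, keypoints)       # target joint positions for the robot
        robot.apply(action)
        time.sleep(max(0.0, dt - (time.time() - t0)))  # hold a fixed control rate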

Simulation Results

Demo videos: Unitree H1 · Unitree G1

Abstract

This paper enables real-world humanoid robots to maintain stability while performing expressive motions like humans do. We propose ExBody2, a generalized whole-body tracking framework that can take any reference motion as input and control the humanoid to mimic it. The model is trained in simulation with reinforcement learning and then transferred to the real world. It decouples keypoint tracking from velocity control, and effectively leverages a privileged teacher policy to distill precise mimicking skills into the target student policy, enabling high-fidelity replication of dynamic movements such as running, crouching, dancing, and other challenging motions. We present a comprehensive qualitative and quantitative analysis of the crucial design factors. We conduct experiments on two humanoid platforms and demonstrate the superiority of our approach over state-of-the-art methods, providing practical guidelines for pushing whole-body control for humanoid robots to its limits.
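Below is a minimal sketch of the privileged teacher-to-student distillation the abstract describes, assuming a PyTorch setup; all network sizes and observation names are invented for illustration and are not taken from the paper's code.

# Minimal sketch of privileged teacher -> student distillation (DAgger-style).
# Module and observation names are hypothetical, not from the ExBody2 codebase.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim, out_dim, hidden=(512, 256)):
        super().__init__()
        layers, d = [], in_dim
        for h in hidden:
            layers += [nn.Linear(d, h), nn.ELU()]
            d = h
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Teacher sees privileged simulator state (e.g., true root velocity, contacts)
# plus the reference-motion target; it is assumed to be pre-trained with RL.
teacher = MLP(in_dim=64 + 32, out_dim=19)      # privileged obs + motion target
# Student only sees onboard observations (proprioception history + motion target),
# so it can run on the real robot.
student = MLP(in_dim=48 * 5 + 32, out_dim=19)  # 5-step proprio history + target

optim = torch.optim.Adam(student.parameters(), lr=3e-4)

def distill_step(priv_obs, proprio_hist, motion_target):
    """One DAgger-style update: act with the student, regress to teacher actions."""
    with torch.no_grad():  # teacher is frozen; it only provides action labels
        target_act = teacher(torch.cat([priv_obs, motion_target], dim=-1))
    student_act = student(
        torch.cat([proprio_hist.flatten(1), motion_target], dim=-1))
    loss = nn.functional.mse_loss(student_act, target_act)
    optim.zero_grad()
    loss.backward()
    optim.step()
    return student_act.detach(), loss.item()  # student_act drives the next sim step

# Example usage with random tensors standing in for a batch of simulator states.
B = 1024
act, loss = distill_step(
    priv_obs=torch.randn(B, 64),
    proprio_hist=torch.randn(B, 5, 48),
    motion_target=torch.randn(B, 32),
)

Rolling out with the student's own actions while regressing to the teacher's labels, rather than cloning teacher rollouts offline, keeps the supervision on the student's own state distribution (the standard DAgger argument).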

Team

Mazeyu Ji*

Xuanbin Peng*

Fangchen Liu

Jialong Li

Ge Yang

Xuxin Cheng

Xiaolong Wang

BibTeX


@article{ji2024exbody2advancedexpressivehumanoid,
  title={ExBody2: Advanced Expressive Humanoid Whole-Body Control},
  author={Mazeyu Ji and Xuanbin Peng and Fangchen Liu and Jialong Li and Ge Yang and Xuxin Cheng and Xiaolong Wang},
  journal={arXiv preprint arXiv:2412.13196},
  year={2024}
}