Advancements in Humanoid Robotics and Virtual Reality

Humanoid robotics and virtual reality are converging on a shared goal: more realistic, immersive interaction between humans and machines. Researchers are developing new methods for motion retargeting, locomotion, and manipulation that let robots acquire complex skills and interact with their environment in more human-like ways. Approaches under active development include reduced-order models for locomotion control, implicit kinodynamic motion retargeting, and latent-conditioned loco-manipulation.

There is also growing interest in how perception and vision shape human motion and behavior, with studies examining how prior exposure and rendering fidelity affect perceived quality and realism in virtual reality.

Noteworthy papers include "Implicit Kinodynamic Motion Retargeting for Human-to-humanoid Imitation Learning," which proposes an efficient and scalable retargeting framework, and "Moving by Looking: Towards Vision-Driven Avatar Motion Generation," which presents a human avatar that perceives its surroundings and navigates using only egocentric vision. Together, these advances stand to improve the performance and versatility of humanoid robots and virtual reality systems.
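To make "reduced-order models" concrete: the linear inverted pendulum (LIP) is the canonical reduced-order model for bipedal locomotion, approximating the robot as a point mass at constant height above the stance foot. The sketch below is purely illustrative and not taken from any of the cited works (the function name and parameters are my own); it integrates the 1-D LIP dynamics with semi-implicit Euler steps.

```python
import math

def simulate_lip(x0, v0, z0=0.8, g=9.81, dt=0.01, steps=100):
    """Semi-implicit Euler integration of the 1-D linear inverted
    pendulum (LIP): x_ddot = (g / z0) * x, a reduced-order model of a
    biped's center of mass (CoM) at constant height z0 above the
    stance foot. Illustrative sketch, not any paper's implementation.
    """
    omega_sq = g / z0          # pendulum constant (1/s^2)
    x, v = x0, v0
    for _ in range(steps):
        v += omega_sq * x * dt  # CoM accelerates away from the stance foot
        x += v * dt
    return x, v

# Without taking a step, the CoM diverges from upright, which is why
# LIP-based planners schedule footsteps to redirect this motion:
x_final, v_final = simulate_lip(x0=0.1, v0=0.0)
```

The value of such a model is that full-body footstep planning reduces to controlling a single second-order linear system, which is what makes real-time locomotion control tractable.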
Sources
RoMoCo: Robotic Motion Control Toolbox for Reduced-Order Model-Based Locomotion on Bipedal and Humanoid Robots
DynaFlow: Dynamics-embedded Flow Matching for Physically Consistent Motion Generation from State-only Demonstrations