The field of human-machine interaction and robotics is moving toward more intuitive and capable systems. Researchers are exploring new approaches to generate human-like trajectories, to ground visual functional affordances more accurately and interpretably, and to build low-cost robot manipulators with near-industrial-grade performance. There is also growing interest in whole-body manipulation, with a focus on designing safe robotic hardware, developing intuitive teleoperation interfaces, and creating algorithms that learn from human demonstrations. Researchers are further investigating lifespan-guided reinforcement learning to prolong tool life and active inference for long-horizon rearrangement in mobile manipulation. Noteworthy papers in this area include:
- CRAFT: A Neuro-Symbolic Framework for Visual Functional Affordance Grounding, which introduces a framework for interpretable affordance grounding that integrates structured commonsense priors with visual evidence.
- Astribot Suite, a robot learning suite that demonstrates the effectiveness of a unified framework for whole-body coordination and manipulation.
- Prolonging Tool Life: Learning Skillful Use of General-purpose Tools through Lifespan-guided Reinforcement Learning, which introduces a reinforcement learning framework that incorporates tool lifespan as a factor during policy optimization.
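
To make the lifespan-guided idea concrete, below is a minimal, illustrative sketch of how tool wear might be folded into the reward a policy optimizes. This is not the paper's implementation: the wear model, the force threshold, and the `lifespan_coef` trade-off weight are all hypothetical choices made for illustration.

```python
# Minimal sketch (assumed, not the paper's method): lifespan-guided reward shaping.
# A toy wear model accumulates tool damage from excess contact force, and the
# shaped reward trades task progress against the incremental damage per step.

from dataclasses import dataclass


@dataclass
class ToolWearModel:
    """Toy wear accumulator: damage grows when contact force exceeds a safe threshold."""
    wear: float = 0.0          # accumulated damage in [0, 1]
    safe_force: float = 20.0   # N; forces below this cause no wear (assumed)
    wear_rate: float = 1e-4    # damage per Newton of excess force per step (assumed)

    def step(self, contact_force: float) -> float:
        """Update accumulated wear and return the wear increment for this step."""
        increment = self.wear_rate * max(0.0, contact_force - self.safe_force)
        self.wear = min(1.0, self.wear + increment)
        return increment


def lifespan_shaped_reward(task_reward: float,
                           wear_increment: float,
                           lifespan_coef: float = 50.0) -> float:
    """Combine task progress with a penalty on incremental tool damage,
    so the optimized policy completes the task while prolonging tool life."""
    return task_reward - lifespan_coef * wear_increment


# Usage inside a (hypothetical) rollout loop:
wear_model = ToolWearModel()
task_reward, contact_force = 1.0, 35.0  # values a simulator might report
r = lifespan_shaped_reward(task_reward, wear_model.step(contact_force))
print(f"shaped reward = {r:.3f}, accumulated wear = {wear_model.wear:.5f}")
```

The key design choice this sketch highlights is that lifespan enters the objective as a per-step cost rather than a hard constraint, letting standard policy-optimization machinery balance task success against tool degradation.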