Learning Dexterous Deformable Object Manipulation Through Cross-Embodiment Dynamics Learning

Figure: Overall framework.

Abstract

Dexterous manipulation of deformable objects remains a core challenge in robotics due to complex contact dynamics and high-dimensional control. While humans excel at such tasks, transferring these skills to robots is hindered by embodiment gaps. In this work, we propose using particle-based dynamics models as an embodiment-agnostic interface, enabling robots to learn directly from human-object interaction data. By representing both manipulators and objects as particles, we define a shared state and action space across embodiments. Using human demonstrations, we train a graph neural network dynamics model that leverages spatial locality and equivariance to generalize across differing embodiment shapes and structures. For control, we convert embodiment-specific joint actions into particle displacements via forward kinematics, enabling model-based planning in the shared representation space. We demonstrate that our approach transfers manipulation skills from humans to both low-DoF and high-DoF robot hands, achieving real-world clay reshaping without motion retargeting, expert demonstrations, or analytical simulation.
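The abstract compresses several moving parts: a shared particle-based state and action space, a graph-network dynamics model trained on human interaction data, and planning over embodiment-specific joint actions converted to particle displacements via forward kinematics. The sketch below is a minimal, hypothetical illustration of how these pieces could fit together; the hand-written particle update, the toy two-link forward kinematics, and the random-shooting planner are stand-ins chosen for readability, not a reproduction of the paper's learned GNN or real hand kinematics.

```python
import numpy as np

def build_edges(points: np.ndarray, radius: float) -> np.ndarray:
    """Connect particles within a spatial radius (the graph's locality structure)."""
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    src, dst = np.nonzero((dist < radius) & (dist > 0.0))
    return np.stack([src, dst], axis=0)  # shape (2, num_edges)

def dynamics_step(obj_pts, tool_pts, tool_disp, radius=0.03):
    """Predict object particle motion from manipulator particle displacements.

    Stand-in for the learned GNN dynamics model: messages use only relative
    distances and displacements (translation-invariant features, mirroring the
    locality/equivariance the paper exploits), but the weighting here is a
    hand-written heuristic rather than a trained network.
    """
    pts = np.concatenate([obj_pts, tool_pts], axis=0)
    disp = np.concatenate([np.zeros_like(obj_pts), tool_disp], axis=0)
    src, dst = build_edges(pts, radius)
    weights = np.exp(-np.linalg.norm(pts[src] - pts[dst], axis=1, keepdims=True) / radius)
    messages = disp[src] * weights
    pred = np.zeros_like(pts)
    np.add.at(pred, dst, messages)          # aggregate messages per particle
    return obj_pts + pred[: len(obj_pts)]   # only object particles are updated

def forward_kinematics(joint_angles, link_len=0.04):
    """Hypothetical embodiment-specific FK: joint angles -> manipulator particles.

    Toy planar two-link 'fingers' whose tips serve as the manipulator particles;
    a real system would sample particles from the hand's kinematic model.
    """
    q = joint_angles.reshape(-1, 2)             # pair joints into 2-link fingers
    base_x = np.linspace(-0.04, 0.04, len(q))   # finger bases spread along x
    x = base_x + link_len * (np.sin(q[:, 0]) + np.sin(q[:, 0] + q[:, 1]))
    z = 0.06 - link_len * (np.cos(q[:, 0]) + np.cos(q[:, 0] + q[:, 1]))
    y = np.zeros(len(q))
    return np.stack([x, y, z], axis=1)           # one particle per fingertip

def plan_random_shooting(obj_pts, goal_pts, num_samples=64, dof=16):
    """Model-based planning in the shared particle space (random shooting)."""
    best_cost, best_action = np.inf, None
    tool_pts = forward_kinematics(np.zeros(dof))           # current hand pose
    for _ in range(num_samples):
        action = np.random.uniform(-0.1, 0.1, size=dof)    # joint-space sample
        tool_disp = forward_kinematics(action) - tool_pts   # -> particle displacements
        pred_obj = dynamics_step(obj_pts, tool_pts, tool_disp)
        cost = np.mean(np.linalg.norm(pred_obj - goal_pts, axis=1))
        if cost < best_cost:
            best_cost, best_action = cost, action
    return best_action, best_cost

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clay = rng.uniform(-0.05, 0.05, size=(200, 3))   # current clay particles
    goal = clay * np.array([1.2, 0.8, 1.0])          # desired reshaped clay
    action, cost = plan_random_shooting(clay, goal)
    print("best joint action:", np.round(action, 3), "predicted cost:", round(cost, 4))
```

Because states and actions live entirely in particle space, swapping the toy fingers for a human hand or a different robot hand only changes `forward_kinematics`; the dynamics model and planner are untouched, which is the cross-embodiment property the abstract describes.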

Type
ICRA Workshop Paper