RealDex: Towards Human-like Grasping for Robotic Dexterous Hand

Yumeng Liu1,2,*, Yaxun Yang1,*, Youzhuo Wang1,*, Xiaofei Wu1, Jiamin Wang1, Yichen Yao1, Sören Schwertfeger1, Sibei Yang1, Wenping Wang3, Jingyi Yu1, Xuming He1, Yuexin Ma1,†
1ShanghaiTech University, 2The University of Hong Kong, 3Texas A&M University
IJCAI 2024

*Indicates equal contribution. †Corresponding author.

Abstract

In this paper, we introduce RealDex, a pioneering dataset capturing authentic dexterous hand grasping motions infused with human behavioral patterns and enriched with multi-view, multimodal visual data. Using a teleoperation system, we seamlessly synchronize human and robot hand poses in real time. This collection of human-like motions is crucial for training dexterous hands to mimic human movements more naturally and precisely. RealDex holds immense promise for advancing humanoid robots in automated perception, cognition, and manipulation in real-world scenarios. Moreover, we introduce a cutting-edge dexterous grasping motion generation framework that aligns with human experience and enhances real-world applicability by effectively utilizing Multimodal Large Language Models. Extensive experiments demonstrate the superior performance of our method on RealDex and other open datasets.

Video Presentation

BibTeX


        @misc{liu2024realdex,
          title={RealDex: Towards Human-like Grasping for Robotic Dexterous Hand},
          author={Yumeng Liu and Yaxun Yang and Youzhuo Wang and Xiaofei Wu and Jiamin Wang and Yichen Yao and Sören Schwertfeger and Sibei Yang and Wenping Wang and Jingyi Yu and Xuming He and Yuexin Ma},
          year={2024},
          eprint={2402.13853},
          archivePrefix={arXiv},
          primaryClass={cs.RO}
        }