In this paper, we introduce RealDex, a pioneering dataset capturing authentic dexterous hand grasping motions infused with human behavioral patterns, enriched by multi-view and multimodal visual data. Using a teleoperation system, we synchronize human and robot hand poses in real time. This collection of human-like motions is crucial for training dexterous hands to mimic human movements more naturally and precisely. RealDex holds immense promise for advancing humanoid robots toward automated perception, cognition, and manipulation in real-world scenarios. Moreover, we introduce a cutting-edge dexterous grasping motion generation framework that aligns with human experience and enhances real-world applicability by effectively utilizing Multimodal Large Language Models. Extensive experiments demonstrate the superior performance of our method on RealDex and other open datasets.
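To make the teleoperation idea concrete, below is a minimal, hypothetical sketch of the kind of real-time retargeting loop such a system might run: capture a human hand pose, map it into the robot hand's joint space, and stream the targets at a fixed control rate. Every name here (`capture_human_hand_pose`, `retarget`, `send_joint_targets`, the joint count and limits) is an illustrative assumption, not the RealDex API or the paper's actual method.

```python
"""Illustrative sketch of a real-time human-to-robot hand retargeting loop.

This is NOT the RealDex implementation; all functions and constants are
hypothetical stand-ins showing the general shape of a teleoperation loop.
"""
import time
import numpy as np

NUM_ROBOT_JOINTS = 22     # assumed joint count for a dexterous robot hand
CONTROL_RATE_HZ = 60.0    # assumed real-time control rate

def capture_human_hand_pose() -> np.ndarray:
    """Stand-in for a mocap/vision capture call; returns per-joint flexion angles."""
    return np.random.uniform(0.0, 1.2, size=NUM_ROBOT_JOINTS)

def retarget(human_angles: np.ndarray) -> np.ndarray:
    """Map human joint angles into the robot's joint limits.

    Simple clamping here; a real system would solve a per-finger
    kinematic optimization to preserve fingertip positions.
    """
    lower, upper = 0.0, 1.571  # assumed joint limits in radians
    return np.clip(human_angles, lower, upper)

def send_joint_targets(targets: np.ndarray) -> None:
    """Stand-in for the robot hand's command interface."""
    pass

def teleop_loop(duration_s: float = 1.0) -> None:
    """Run the capture -> retarget -> command cycle at the control rate."""
    period = 1.0 / CONTROL_RATE_HZ
    t_end = time.time() + duration_s
    while time.time() < t_end:
        human = capture_human_hand_pose()
        send_joint_targets(retarget(human))
        time.sleep(period)  # hold the loop at the target control rate

if __name__ == "__main__":
    teleop_loop()
```

The key design point this sketch highlights is the fixed-rate loop: pose capture and retargeting must complete within one control period for the robot hand to track human motion without lag.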
@article{liu2024realdex,
title={{RealDex}: Towards Human-like Grasping for Robotic Dexterous Hand},
author={Liu, Yumeng and Yang, Yaxun and Wang, Youzhuo and Wu, Xiaofei and Wang, Jiamin and Yao, Yichen and Schwertfeger, S{\"o}ren and Yang, Sibei and Wang, Wenping and Yu, Jingyi and others},
journal={arXiv preprint arXiv:2402.13853},
year={2024}
}