In this paper, we introduce RealDex, a pioneering dataset capturing authentic dexterous hand grasping motions infused with human behavioral patterns, enriched by multi-view and multimodal visual data. Using a teleoperation system, we synchronize human and robot hand poses in real time. This collection of human-like motions is crucial for training dexterous hands to mimic human movements more naturally and precisely. RealDex holds immense promise for advancing humanoid robots in automated perception, cognition, and manipulation in real-world scenarios. Moreover, we introduce a cutting-edge dexterous grasping motion generation framework that aligns with human experience and enhances real-world applicability by effectively utilizing Multimodal Large Language Models. Extensive experiments demonstrate the superior performance of our method on RealDex and other open datasets.
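To give a concrete sense of the real-time teleoperation loop mentioned above, the following is a minimal sketch, not the authors' implementation: it assumes a hypothetical hand tracker (`capture_human_pose`), a precomputed linear retargeting map `W`, and a placeholder robot command interface (`send_to_robot`); joint counts and the linear mapping are illustrative assumptions only.

```python
# Illustrative sketch of human-to-robot hand pose retargeting for teleoperation.
# All names, dimensions, and the linear-mapping approach are assumptions,
# not the actual RealDex teleoperation system.
import time
import numpy as np

N_HUMAN_KEYPOINTS = 21  # e.g., a 21-keypoint human hand model (assumed)
N_ROBOT_DOFS = 24       # illustrative number of robot hand joints (assumed)


def capture_human_pose() -> np.ndarray:
    """Placeholder for a mocap/vision-based hand tracker (hypothetical)."""
    return np.zeros((N_HUMAN_KEYPOINTS, 3))


def retarget(keypoints: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Map flattened human keypoints to robot joint angles.

    Real systems typically solve an optimization that respects joint limits
    and fingertip correspondences; a calibrated linear map stands in here.
    """
    q = W @ keypoints.reshape(-1)
    return np.clip(q, -np.pi, np.pi)  # crude joint-limit clamp


def send_to_robot(q: np.ndarray) -> None:
    """Placeholder for the dexterous hand's command interface (hypothetical)."""
    pass


def teleop_loop(rate_hz: float = 30.0, n_steps: int = 1000) -> None:
    """Run a fixed-rate retargeting loop: track, retarget, command."""
    W = np.zeros((N_ROBOT_DOFS, N_HUMAN_KEYPOINTS * 3))  # calibrated offline
    period = 1.0 / rate_hz
    for _ in range(n_steps):
        start = time.monotonic()
        q = retarget(capture_human_pose(), W)
        send_to_robot(q)
        # Sleep off the remainder of the cycle to hold a steady rate.
        time.sleep(max(0.0, period - (time.monotonic() - start)))
```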
@misc{liu2024realdex,
      title={RealDex: Towards Human-like Grasping for Robotic Dexterous Hand},
      author={Yumeng Liu and Yaxun Yang and Youzhuo Wang and Xiaofei Wu and Jiamin Wang and Yichen Yao and Sören Schwertfeger and Sibei Yang and Wenping Wang and Jingyi Yu and Xuming He and Yuexin Ma},
      year={2024},
      eprint={2402.13853},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}