Towards Affordance-Aware Robotic Dexterous Grasping with Human-like Priors

1Wuhan University, 2DAMO Academy, Alibaba Group, 3Hupan Lab, 4Zhejiang University, 5Tsinghua University

Abstract

A dexterous hand capable of generalizable object grasping is fundamental to the development of general-purpose embodied AI. However, previous methods focus narrowly on low-level grasp stability metrics, neglecting the affordance-aware positioning and human-like poses that are crucial for downstream manipulation. To address these limitations, we propose AffordDex, a novel two-stage training framework that learns a universal grasping policy with an inherent understanding of both motion priors and object affordances. In the first stage, a trajectory imitator is pretrained on a large corpus of human hand motions to instill a strong prior for natural movement. In the second stage, a residual module is trained to adapt these general human-like motions to specific object instances. This refinement is critically guided by two components: our Negative Affordance-aware Segmentation (NAA) module, which identifies functionally inappropriate contact regions, and a privileged teacher-student distillation process that ensures the final vision-based policy achieves a high success rate. Extensive experiments demonstrate that AffordDex not only achieves universal dexterous grasping but also remains remarkably human-like in posture and functionally appropriate in contact location. As a result, AffordDex significantly outperforms state-of-the-art baselines on seen objects, unseen instances, and even entirely novel categories.
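To make the two-stage design above concrete, below is a minimal PyTorch-style sketch of how a frozen human-motion prior could be combined with a learned residual correction and an NAA-style contact penalty. It is an illustrative sketch under our own assumptions: every class name, tensor shape, and the penalty form are hypothetical placeholders, not the authors' implementation.

      # Illustrative sketch only; names, dimensions, and losses are hypothetical.
      import torch
      import torch.nn as nn

      class TrajectoryImitator(nn.Module):
          """Stage 1: base policy pretrained to imitate human hand-motion trajectories."""
          def __init__(self, obs_dim, act_dim, hidden=256):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(obs_dim, hidden), nn.ReLU(),
                  nn.Linear(hidden, hidden), nn.ReLU(),
                  nn.Linear(hidden, act_dim),
              )

          def forward(self, obs):
              return self.net(obs)

      class ResidualPolicy(nn.Module):
          """Stage 2: residual module adapting the human-like prior to a specific object."""
          def __init__(self, obs_dim, act_dim, hidden=256):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(obs_dim, hidden), nn.ReLU(),
                  nn.Linear(hidden, act_dim),
              )

          def forward(self, obs):
              return self.net(obs)

      def compose_action(base, residual, obs):
          # Final action = human-like prior action + object-specific residual correction.
          with torch.no_grad():  # the Stage-1 prior is kept frozen here (an assumption)
              prior_action = base(obs)
          return prior_action + residual(obs)

      def naa_contact_penalty(contact_points, negative_mask):
          # Penalize contacts falling on functionally inappropriate regions flagged by a
          # (hypothetical) NAA segmentation output.
          # contact_points: (B, N) contact indicators; negative_mask: (B, N) in {0, 1}.
          return (contact_points * negative_mask).sum(dim=-1).mean()

      if __name__ == "__main__":
          obs_dim, act_dim = 64, 22  # made-up dimensions, e.g. hand joint targets
          base = TrajectoryImitator(obs_dim, act_dim)
          residual = ResidualPolicy(obs_dim, act_dim)
          obs = torch.randn(8, obs_dim)
          action = compose_action(base, residual, obs)
          penalty = naa_contact_penalty(torch.rand(8, 16), (torch.rand(8, 16) > 0.5).float())
          print(action.shape, penalty.item())

In this reading, the residual policy would be optimized with a reward that includes the NAA penalty, while the privileged teacher-student distillation (not shown) transfers the resulting policy to a vision-only student.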

Pipeline
Figure 1: Overview of the proposed pipeline.

Visualization of Grasping

BibTeX


      @article{zhao2025afforddex,
        title   = {Towards Affordance-Aware Robotic Dexterous Grasping with Human-like Priors},
        author  = {Zhao, Haoyu and Zhuang, Linghao and Zhao, Xingyue and Zeng, Cheng and Xu, Haoran and Jiang, Yuming and Cen, Jun and Wang, Kexiang and Guo, Jiayan and Huang, Siteng and Li, Xin and Zhao, Deli and Zou, Hua},
        journal = {arXiv preprint arXiv:2508.08896},
        year    = {2025}
      }