Recent advancements in the control of prosthetic hands have focused on increasing autonomy through the use of cameras and other sensory inputs. These systems aim to reduce the cognitive load on the user by automatically controlling certain degrees of freedom. In robotics, imitation learning has emerged as a promising approach for learning grasping and complex manipulation tasks while simplifying data collection. Its application to the control of prosthetic hands, however, remains largely unexplored. Bridging this gap could enhance dexterity restoration and enable prosthetic devices to operate in more unconstrained scenarios, where tasks are learned from demonstrations rather than from manually annotated sequences. To this end, we present HannesImitationPolicy, an imitation learning-based method for controlling the Hannes prosthetic hand, enabling object grasping in unstructured environments. We further introduce the HannesImitationDataset, comprising grasping demonstrations in table, shelf, and human-to-prosthesis handover scenarios. We use these data to train a policy and deploy it on the prosthetic hand to predict wrist orientation and hand closure for grasping. Experimental evaluation demonstrates successful grasps across diverse objects and conditions. Finally, the policy outperforms a segmentation-based visual-servo controller in unstructured scenarios.
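The policy architecture is not detailed in this abstract; purely as a minimal behavior-cloning sketch of the idea (an RGB encoder regressing wrist orientation and hand closure), assuming PyTorch and hypothetical names such as `BCPolicy`, not the paper's actual model:

```python
# Minimal behavior-cloning sketch (illustrative only): a pretrained RGB
# encoder regresses wrist and hand-closure commands. Module names and the
# 3-D action layout are assumptions, not the paper's architecture.
import torch
import torch.nn as nn
import torchvision.models as models

class BCPolicy(nn.Module):
    def __init__(self, action_dim: int = 3):
        super().__init__()
        # ResNet-18 backbone as a generic visual encoder (fc layer dropped).
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.encoder = nn.Sequential(*list(backbone.children())[:-1])
        # Head predicting [wrist pronation/supination, wrist flexion/extension,
        # hand closure].
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512, 256),
            nn.ReLU(),
            nn.Linear(256, action_dim),
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # rgb: (B, 3, H, W) normalized camera frames -> (B, action_dim) commands.
        return self.head(self.encoder(rgb))

# One training step: mean-squared error against demonstrated actions.
policy = BCPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
rgb = torch.randn(8, 3, 224, 224)   # placeholder batch of camera frames
actions = torch.randn(8, 3)         # placeholder demonstrated actions
loss = nn.functional.mse_loss(policy(rgb), actions)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```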
A dataset designed for learning control policies with the Hannes prosthetic hand via behavior cloning.
Collection | Scenario / Task | Clutter | Objects | Demos per object | Description |
---|---|---|---|---|---|
#1 | Table Grasp | ✗ | 15 YCB | 10 | The user drives the prosthesis to grasp objects from a table with a wood-patterned surface. |
#2 | Shelf Grasp | ✗ | 15 YCB | 10 | The user guides the prosthetic hand to grasp objects from the top of a white shelf. This scenario introduces a different visual perspective and requires distinct wrist and hand movements compared to #1, challenging the policy to adapt to different grasping angles and spatial constraints. |
#3 | Human-to-Hannes Handover | ✔ | 15 YCB | 10 | A subject hands an object over to the prosthetic hand controlled by the user. This scenario is particularly challenging due to its unstructured setting, potential object occlusions, and background clutter. |
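The released file format is not described above; as a rough illustration of how one demonstration from these collections might be structured into (observation, action) pairs for behavior cloning, with all field names (`rgb`, `wrist_ps`, `wrist_fe`, `closure`) being assumptions rather than the dataset's actual schema:

```python
# Hypothetical episode layout for one demonstration (field names assumed):
# synchronized camera frames and recorded wrist/hand commands, iterable as
# (observation, action) pairs for behavior cloning.
from dataclasses import dataclass
import numpy as np

@dataclass
class Episode:
    collection: int        # 1 = table, 2 = shelf, 3 = handover
    object_name: str       # YCB object identifier
    rgb: np.ndarray        # (T, H, W, 3) camera frames
    wrist_ps: np.ndarray   # (T,) wrist pronation/supination command
    wrist_fe: np.ndarray   # (T,) wrist flexion/extension command
    closure: np.ndarray    # (T,) hand closure command

    def pairs(self):
        # Yield (observation, action) training pairs, one per time step.
        for t in range(len(self.rgb)):
            obs = self.rgb[t]
            action = np.array([self.wrist_ps[t], self.wrist_fe[t], self.closure[t]])
            yield obs, action
```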
Table Grasp showing combined wrist pronation and flexion followed by finger closure.
Shelf Grasp showing wrist supination followed by finger closure.
Human-to-Hannes Handover showing moderate wrist supination followed by finger closure.
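Together, the clips above show the two predicted quantities in action: wrist orientation (pronation/supination and flexion) and hand closure. Below is a hedged sketch of a fixed-rate deployment loop; the `HannesInterface` driver API is hypothetical, not the real Hannes SDK:

```python
# Hypothetical deployment loop (interface names assumed): at each step the
# policy maps the current camera frame to wrist and closure commands, which
# are streamed to the hand at a fixed rate until the grasp completes.
import time
import numpy as np

class HannesInterface:
    """Stand-in for the real hand/wrist driver; these methods are assumptions."""
    def read_camera(self) -> np.ndarray: ...
    def send(self, wrist_ps: float, wrist_fe: float, closure: float) -> None: ...

def run_grasp(policy, hand: HannesInterface, hz: float = 10.0, steps: int = 100):
    for _ in range(steps):
        frame = hand.read_camera()
        # Assumes the policy returns three scalar commands for this frame.
        wrist_ps, wrist_fe, closure = policy(frame)
        hand.send(wrist_ps, wrist_fe, closure)
        time.sleep(1.0 / hz)  # fixed-rate control loop
```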
@inproceedings{...,
title={HannesImitation: Grasping with the Hannes Prosthetic Hand via Imitation Learning},
author={...},
booktitle={...},
pages={},
year={},
}