Objectives:
- Acquisition of datasets for the HANDLE project database so that the HANDLE partners can implement learning-by-demonstration approaches to endow the SHADOW robotic hand and arm with equivalent skills.
- The datasets consist of observations of human demonstrations of dexterous manipulation of laboratory pipettes, captured from two different perspectives by two monocular RGB cameras.
- The datasets need to be manually annotated frame by frame by several partners of the HANDLE project.
- Fast visualization of the contents of the datasets.
Outputs:
- 240 datasets: 12 different people performing 4 dexterous manipulation tasks using a laboratory pipette to move liquids between two containers:
  - Scenario 1: Pipette - Pick Stand, Place Stand
  - Scenario 2: Pipette - Pick Stand, Place Stand - No thumb
  - Scenario 3: Pipette - Pick Stand, Place Table
  - Scenario 4: Pipette - Pick Stand, Place Table - No thumb
- Request access to the HANDLE project database here.
- MATLAB frame-by-frame annotation tool. Annotation dictionary available in the interface. Exports the annotations in plain text or XML format (a minimal export sketch is given after this list). MATLAB code available here. Demonstration video:
- Fast preview video for each dataset. Synchronized views from the two monocular RGB cameras (a minimal preview sketch is given after this list). MATLAB code available here. Demonstration video:
- Robotic platforms performing the dexterous manipulation of a laboratory pipette. Dissemination videos of the HANDLE project (Euronews, European Commission, ...):
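The following is a minimal MATLAB sketch of the plain text and XML export offered by the annotation tool. The frame indices, label names, and output file names used here are hypothetical placeholders; the actual annotation structure of the tool may differ.

    % Hypothetical per-frame annotations (frame number, label from the annotation dictionary)
    frames = 1:5;
    labels = {'approach', 'approach', 'grasp', 'grasp', 'lift'};

    % Plain text export: one "frame<TAB>label" line per frame
    fid = fopen('annotation.txt', 'w');
    for k = 1:numel(frames)
        fprintf(fid, '%d\t%s\n', frames(k), labels{k});
    end
    fclose(fid);

    % XML export using MATLAB's built-in DOM utilities
    docNode = com.mathworks.xml.XMLUtils.createDocument('annotation');
    root = docNode.getDocumentElement;
    for k = 1:numel(frames)
        frameNode = docNode.createElement('frame');
        frameNode.setAttribute('index', num2str(frames(k)));
        frameNode.appendChild(docNode.createTextNode(labels{k}));
        root.appendChild(frameNode);
    end
    xmlwrite('annotation.xml', docNode);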
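The following is a minimal MATLAB sketch of a synchronized two-camera preview, assuming the two recordings of a dataset are frame-synchronized. The file names camera1.avi and camera2.avi are placeholders, not the datasets' actual naming scheme.

    cam1 = VideoReader('camera1.avi');   % first monocular RGB view
    cam2 = VideoReader('camera2.avi');   % second monocular RGB view

    figure('Name', 'Dataset preview');
    while hasFrame(cam1) && hasFrame(cam2)
        f1 = readFrame(cam1);
        f2 = readFrame(cam2);
        % Pad the shorter frame with black rows so the views can sit side by side
        h = max(size(f1, 1), size(f2, 1));
        f1(end+1:h, :, :) = 0;
        f2(end+1:h, :, :) = 0;
        image([f1 f2]);                   % truecolor display of the concatenated views
        axis image off;
        drawnow;                          % fast preview, no fixed frame rate
    end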