Action gist based automatic segmentation for periodic in-hand manipulation movement learning

Objectives:

- Dataset acquisition performed in collaboration with the HANDLE project partner: University of Hamburg - Technical Aspects of Multimodal Systems (TAMS) Lab.

- To develop a software tool that automatically segments dexterous in-hand manipulation movements. The automatic segmentation is benchmarked against segmentation performed manually by a human operator.
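The paper's action-gist method itself is not reproduced here, but the core idea of segmenting a periodic movement can be illustrated with a much simpler baseline: estimate the dominant period of a recorded joint-angle trace by autocorrelation, then cut the trace into consecutive cycles. The sketch below is purely illustrative; the function names and the synthetic sine "joint angle" trace are assumptions, not part of the HANDLE dataset or software.

```python
import math

def estimate_period(signal, min_lag=2):
    """Estimate the dominant period (in samples) of a 1-D signal.

    Uses a normalized autocorrelation and returns the lag with the
    highest correlation, searched from min_lag up to half the length.
    This is a generic baseline, not the action-gist method of the paper.
    """
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]          # remove the DC offset
    denom = sum(v * v for v in x)           # zero-lag autocorrelation
    best_lag, best_r = min_lag, float("-inf")
    for lag in range(min_lag, n // 2):
        r = sum(x[i] * x[i + lag] for i in range(n - lag)) / denom
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

def segment_by_period(signal, period):
    """Cut the signal into consecutive full cycles of the given period."""
    return [signal[i:i + period]
            for i in range(0, len(signal) - period + 1, period)]

# Synthetic periodic trace standing in for one CyberGlove joint angle:
# a sine wave with a period of 20 samples.
trace = [math.sin(2 * math.pi * i / 20) for i in range(200)]
period = estimate_period(trace)             # expected: 20
segments = segment_by_period(trace, period) # 10 cycles of 20 samples each
```

On real glove data one would typically filter the signal first and combine several joint channels; the autocorrelation step only finds the cycle length, whereas the paper's approach additionally characterizes what happens within each cycle.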

Outputs:

- Datasets of a human demonstrator performing an in-hand dexterous manipulation task: in-hand rotation of a screwdriver (a periodic movement). The hand of the human subject was equipped with a CyberGlove and a Tekscan tactile sensing grasp array. The demonstrations were also recorded with a monocular RGB camera. Request access to the HANDLE project database here. Sample dataset video:

- Publication by the HANDLE partner University of Hamburg - Technical Aspects of Multimodal Systems (TAMS) Lab: G. Cheng, N. Hendrich, and J. Zhang, "Action gist based automatic segmentation for periodic in-hand manipulation movement learning," in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7-12 Oct. 2012, pp. 4768-4775. [PDF]

- PhD thesis by HANDLE researcher Gang Cheng: G. Cheng, "State-Action Gist based In-hand Manipulation Learning from Human Demonstration", PhD thesis, University of Hamburg, 2013. [PDF]
