Touch attention Bayesian models for object feature extraction in robotic blind manipulation

New Neurocomputing journal paper about touch attention mechanisms in robotic systems.

Title:

Touch attention Bayesian models for object feature extraction in robotic blind manipulation

Authors:

Ricardo Martins

Joao Filipe Ferreira

Jorge Dias

Abstract:

Nowadays, robotic platforms tend to be equipped with a combination of multi-modal artificial perception systems to navigate and interact with the surrounding environment and persons. The complexity and dynamic characteristics of those environments have led to the development of attention mechanisms that filter the sensory overload, so that only the relevant sensory data are sensed, perceived and processed. This work presents Bayesian models of the attentional mechanisms involved in the blind manipulation of objects, namely the detection of borders, border following and corner detection on object surfaces. These object features can be used as structural references of the manipulated object in the subsequent manipulation stages. The perception of those stimuli requires coordination between the attention, perception and action mechanisms in order to direct (attention) and promote (action) the contact between the touch sensors and the object, so as to maximize the acquisition of information (perception) and reduce uncertainty. The Bayesian model integrates information about the surface curvature, the motion direction of the fingers and the surface texture. The magnitude of the variation of those object properties characterizes the transition regions between the object and the surrounding environment. The detection of these transitions allows the robotic hand to estimate an initial location of the borders of the object, start the border following with the robotic fingers, and recognize the intersections between borders, i.e. the corners. The statistical temporal relations between the action primitives required to explore and manipulate the object are learned during an offline learning period; the action plan is then inferred from this learned knowledge. The models of this work have been tested in a scenario involving the blind manipulation of a towel and the detection of its corners.
This scenario was implemented in the ROS-Gazebo simulation environment, using the Shadow dexterous robotic hand model equipped with Shadow tactile sensor models attached to the fingertips of the robotic hand.
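To give a feel for the kind of cue fusion the abstract describes, the following is a minimal, hypothetical sketch of Bayesian fusion of three touch cues (surface curvature change, finger motion direction and texture variation) into a per-cell border probability. The function name, the naive-Bayes independence assumption and the toy values are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

def border_posterior(prior, lik_curvature, lik_motion, lik_texture):
    """Fuse cue likelihoods into P(border | cues) per grid cell.

    All inputs are arrays of the same shape with values in (0, 1):
    `prior` is P(border); each likelihood array is P(cue | border),
    with P(cue | no border) taken as its complement. Cues are assumed
    conditionally independent given the border hypothesis (naive Bayes).
    """
    num = prior * lik_curvature * lik_motion * lik_texture
    den = num + (1 - prior) * (1 - lik_curvature) * (1 - lik_motion) * (1 - lik_texture)
    return num / den

# Toy 1-D strip of 5 cells: all three cues change sharply at cell 2,
# which should therefore be flagged as a transition (border) region.
prior = np.full(5, 0.5)
curv = np.array([0.1, 0.2, 0.90, 0.2, 0.1])   # curvature-change cue
mot  = np.array([0.2, 0.3, 0.80, 0.3, 0.2])   # motion-direction cue
tex  = np.array([0.1, 0.2, 0.95, 0.2, 0.1])   # texture-variation cue

post = border_posterior(prior, curv, mot, tex)
print(post.argmax())  # index of the most likely border cell
```

In the paper's setting such posteriors would live on a probabilistic grid map over the explored surface, and the cells with the strongest transitions would seed the border-following behaviour.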

Publisher:

32nd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2012

Date Published:

15-07-2012

Alternative full-text PDF:

download full-text PDF via University of Coimbra

Keywords:

robotics, touch attention, tactile attention, artificial perception, Bayesian modelling, path planning, haptic exploration, probabilistic grid maps

