Grasping Movements Recognition in 3D Space Using a Bayesian Approach

Diego R. Faria, Hadi Aliakbarpour, Jorge Dias

Research output: Chapter in Book / Published conference output › Conference publication

Abstract

In this work we present the recognition of grasping movements in 3D space. We also introduce the idea of a database of data from different sensors, covering different grasping and handling scenarios, intended for our future work. Multi-sensor information for grasping tasks requires sensor calibration and time-stamped, synchronized data, which we have begun to develop and intend to share with researchers in this area. In the scenario presented here, grasp recognition is performed by combining two different types of features extracted from the reach-to-grasp movement. By observing the reach-to-grasp movements of different subjects, we carry out a histogram-based learning phase using the segmented data. Based on this learning phase, it is possible to recognize grasping movements by applying Bayes' rule through continuous classification based on multiplicative updates of beliefs. We developed an automated system to estimate and recognize two possible types of grasp from hand movements performed by humans, tracked with a magnetic tracking device [9]. These steps are important for understanding human behaviour prior to object manipulation and can be used to endow a robot with autonomous capabilities, such as showing how to reach an object for manipulation or displacement.
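
The abstract describes continuous classification via multiplicative belief updates over histogram-based likelihoods learned per grasp type. The following is a minimal sketch of that idea, not the authors' implementation: the class name, grasp labels, feature binning, and histograms are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of recursive (multiplicative) Bayesian classification over
# two grasp types, using histogram likelihoods learned offline.
# All names, labels, and the binning scheme are illustrative assumptions.

class GraspClassifier:
    def __init__(self, histograms, bin_edges, prior=None):
        # histograms: dict mapping grasp label -> normalized feature histogram
        # bin_edges: common bin edges used to discretize feature observations
        self.histograms = histograms
        self.bin_edges = bin_edges
        self.labels = list(histograms.keys())
        self.belief = {l: (1.0 / len(self.labels) if prior is None else prior[l])
                       for l in self.labels}

    def update(self, feature_value, eps=1e-6):
        # Multiplicative belief update: posterior ∝ likelihood × prior,
        # applied at every new observation along the reach-to-grasp movement.
        bin_idx = int(np.clip(np.digitize(feature_value, self.bin_edges) - 1,
                              0, len(self.bin_edges) - 2))
        for label in self.labels:
            likelihood = self.histograms[label][bin_idx] + eps
            self.belief[label] *= likelihood
        total = sum(self.belief.values())
        for label in self.labels:
            self.belief[label] /= total
        return max(self.belief, key=self.belief.get)


# Example usage with two hypothetical grasp types and synthetic histograms.
bin_edges = np.linspace(0.0, 1.0, 11)                  # 10 bins over a feature range
hists = {
    "precision_grasp": np.full(10, 0.1),               # flat placeholder histogram
    "power_grasp": np.array([0.02] * 5 + [0.18] * 5),  # skewed placeholder histogram
}
clf = GraspClassifier(hists, bin_edges)
for observation in [0.72, 0.81, 0.65, 0.90]:           # stream of feature observations
    decision = clf.update(observation)
print("Most probable grasp type:", decision)
```

The normalization step after each multiplicative update keeps the beliefs interpretable as a posterior distribution and prevents numerical underflow during long observation sequences.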
Original language: English
Title of host publication: ICAR'09, 14th International Conference on Advanced Robotics
Publication status: Published - 2009
Event: ICAR 2009, International Conference on Advanced Robotics - Munich, Germany
Duration: 22 Jun 2009 to 26 Jun 2009

Conference

Conference: ICAR 2009, International Conference on Advanced Robotics
Country/Territory: Germany
City: Munich
Period: 22/06/09 to 26/06/09
