Identifying Objects from Hand Configurations during In-hand Exploration

Diego R. Faria, Jorge Lobo, Jorge Dias

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this work we use hand configurations and contact points during in-hand object exploration to identify manipulated objects. The contact points associated with an object shape can be represented in a latent space: they lie on a lower-dimensional non-linear manifold in the contact-point space, which makes them suitable for modelling and recognition. Hand configurations are associated with specific objects and learned by means of Gaussian mixture models; by identifying the hand configuration during in-hand exploration, we can then generate hypotheses about candidate objects, selecting the most probable objects from a database. The set of contact points accumulated during exploration (a partial volume of the object shape) is matched against this selected set of most probable candidates. Results are presented for human manipulation of objects, but the approach can also be applied to artificial hands; we address only object identification, not hand control.
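The candidate-selection step described above (one Gaussian mixture model per object over hand-configuration features, with objects ranked by the likelihood of the observed configuration) can be sketched roughly as follows. This is a minimal illustration with synthetic data: the object names, feature dimensionality, and use of scikit-learn are assumptions for the example, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic hand-configuration features (e.g. joint angles); in the paper
# these would come from recorded in-hand exploration data.
objects = ["ball", "bottle", "box"]
train = {name: rng.normal(loc=i, scale=0.3, size=(200, 5))
         for i, name in enumerate(objects)}

# One GMM per object models the distribution of hand configurations
# observed while manipulating that object.
models = {name: GaussianMixture(n_components=2, random_state=0).fit(X)
          for name, X in train.items()}

def rank_candidates(hand_config):
    """Rank objects by log-likelihood of the observed hand configuration."""
    scores = {name: gmm.score_samples(hand_config.reshape(1, -1))[0]
              for name, gmm in models.items()}
    return sorted(scores, key=scores.get, reverse=True)

# A query configuration drawn near the "bottle" cluster; the top-ranked
# candidates would then be matched against accumulated contact points.
query = rng.normal(loc=1.0, scale=0.3, size=5)
print(rank_candidates(query))
```

In the full pipeline, the ranked list would prune the database before the more expensive contact-point (partial shape) matching stage.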
Original language: English
Title of host publication: IEEE International Conference on Multisensor Fusion and Information Integration (IEEE MFI)
Publisher: IEEE
Pages: 132-137
Number of pages: 6
DOI: 10.1109/MFI.2012.6343033
Publication status: Published - 12 Nov 2012


Cite this

    Faria, D. R., Lobo, J., & Dias, J. (2012). Identifying Objects from Hand Configurations during In-hand Exploration. In IEEE International Conference on Multisensor Fusion and Information Integration (IEEE MFI) (pp. 132-137). IEEE. https://doi.org/10.1109/MFI.2012.6343033