A human activity recognition framework using max-min features and key poses with differential evolution random forests classifier

Urbano Miguel Nunes*, Diego R. Faria, Paulo Peixoto

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a novel framework for human daily activity recognition that is designed to rely on few training examples and to offer fast training times, making it suitable for real-time applications. The proposed framework starts with a feature extraction stage, in which each activity is divided into variable-size actions based on key poses. Each action window is delimited by two consecutive, automatically identified key poses, from which static (i.e. geometrical) and max-min dynamic (i.e. temporal) features are extracted. These features are first used to train a random forest (RF) classifier, which was tested on the CAD-60 dataset and obtained relevant overall average results. Then, in a second stage, an extension of the RF is proposed in which the differential evolution meta-heuristic algorithm is used as the node-splitting methodology. The main advantage of this extension is that the differential evolution random forest has no thresholds to tune, only a few adjustable parameters with well-defined behavior.
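
As a purely illustrative sketch of the feature extraction stage summarised above, the Python snippet below segments a skeleton sequence into action windows between consecutive key poses and computes per-window static plus max-min (range) dynamic features. The key-pose criterion used here (local minima of overall joint motion energy) and the helper names detect_key_poses and max_min_features are assumptions made for this example only, not the paper's exact formulation.

    import numpy as np

    def detect_key_poses(skeleton_seq, smooth=3):
        # skeleton_seq: (T, J, 3) array of 3D joint positions over T frames.
        # Assumed heuristic: key poses as local minima of joint motion energy.
        velocity = np.diff(skeleton_seq, axis=0)          # (T-1, J, 3)
        energy = np.sum(velocity ** 2, axis=(1, 2))       # (T-1,)
        kernel = np.ones(smooth) / smooth                 # moving-average smoothing
        energy = np.convolve(energy, kernel, mode="same")
        key_poses = [0]
        for t in range(1, len(energy) - 1):
            if energy[t] < energy[t - 1] and energy[t] <= energy[t + 1]:
                key_poses.append(t)
        key_poses.append(len(skeleton_seq) - 1)
        return key_poses

    def max_min_features(skeleton_seq, key_poses):
        # One feature vector per action window between consecutive key poses:
        # static (geometrical) features from the two delimiting key poses, plus
        # max-min dynamic features (per-coordinate range over the window).
        features = []
        for start, end in zip(key_poses[:-1], key_poses[1:]):
            window = skeleton_seq[start:end + 1]          # (W, J, 3)
            static = np.concatenate([window[0].ravel(), window[-1].ravel()])
            dynamic = (window.max(axis=0) - window.min(axis=0)).ravel()
            features.append(np.concatenate([static, dynamic]))
        return np.asarray(features)

In the framework, each resulting per-window feature vector would then be fed to the random forest classifier and, in the second stage, to its differential evolution variant.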
Original language: English
Pages (from-to): 21-31
Number of pages: 11
Journal: Pattern Recognition Letters
Volume: 99
Early online date: 2 May 2017
DOIs
Publication status: Published - 1 Nov 2017

Bibliographical note

© 2017, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International http://creativecommons.org/licenses/by-nc-nd/4.0/

Keywords

  • human activity recognition
  • max-min features
  • key pose
  • random forests
  • differential evolution

