Experiments on an RGB-D wearable vision system for egocentric activity recognition

Research output: Contribution to journal › Conference article › Research › peer-review

This work describes and explores novel steps towards activity recognition from an egocentric point of view. Activity recognition is a broadly studied topic in computer vision, but the unique characteristics of wearable vision systems present new challenges and opportunities. We evaluate our approach on a challenging new, publicly available dataset that includes trajectories of different users across two indoor environments performing a set of more than 20 different activities. The visual features studied range from compact, global image descriptors, such as GIST and a novel skin-segmentation-based histogram signature, to state-of-the-art image representations for recognition, including Bag of SIFT words and Convolutional Neural Network (CNN) based features. Our experiments show that simple and compact features provide reasonable accuracy for basic activity information (in our case, manipulation vs. non-manipulation). However, for finer-grained categories, CNN-based features provide the most promising results. Future steps include integrating depth information with these features and adding temporal consistency to the pipeline.
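As a rough illustration of one of the baselines the abstract mentions, the sketch below outlines a generic Bag-of-SIFT-words image representation in Python using OpenCV and scikit-learn. It is not the authors' implementation; the function names and the vocabulary size (k = 500) are illustrative assumptions.

    # Minimal sketch of a Bag-of-SIFT-words pipeline (illustrative only,
    # not the paper's code). Requires OpenCV with SIFT and scikit-learn.
    import cv2
    import numpy as np
    from sklearn.cluster import KMeans

    def sift_descriptors(image_paths):
        """Extract SIFT descriptors from a list of image files."""
        sift = cv2.SIFT_create()
        all_desc = []
        for path in image_paths:
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            _, desc = sift.detectAndCompute(img, None)
            if desc is not None:
                all_desc.append(desc)
        return all_desc

    def build_vocabulary(descriptor_list, k=500):
        """Cluster all pooled descriptors into a k-word visual vocabulary."""
        stacked = np.vstack(descriptor_list)
        return KMeans(n_clusters=k, n_init=4).fit(stacked)

    def bow_histogram(descriptors, vocabulary):
        """Quantize one image's descriptors into a normalized word histogram."""
        words = vocabulary.predict(descriptors)
        hist, _ = np.histogram(words, bins=np.arange(vocabulary.n_clusters + 1))
        return hist / max(hist.sum(), 1)

Each image is then represented by its normalized histogram over the visual vocabulary, which can be fed to any standard classifier for the activity labels.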

Original language: English
Journal: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Pages (from-to): 611-617
Number of pages: 7
ISSN: 2160-7508
DOIs
Publication status: Published - 24 Sep 2014
Externally published: Yes
Event: 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2014 - Columbus, United States
Duration: 23 Jun 2014 - 28 Jun 2014

Conference

Conference: 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2014
Country: United States
City: Columbus
Period: 23/06/2014 - 28/06/2014

Bibliographical note

Publisher Copyright:
© 2014 IEEE.
