Experiments on an RGB-D wearable vision system for egocentric activity recognition
Publication: Contribution to journal › Conference article › Research › peer-reviewed
This work describes and explores novel steps towards activity recognition from an egocentric point of view. Activity recognition is a broadly studied topic in computer vision, but the unique characteristics of wearable vision systems present new challenges and opportunities. We evaluate on a challenging new publicly available dataset that includes trajectories of different users across two indoor environments performing a set of more than 20 different activities. The visual features studied include compact global image descriptors, such as GIST and a novel skin-segmentation-based histogram signature, as well as state-of-the-art image representations for recognition, including Bag of SIFT words and Convolutional Neural Network (CNN) based features. Our experiments show that simple and compact features provide reasonable accuracy for obtaining basic activity information (in our case, manipulation vs. non-manipulation). However, for finer-grained categories, CNN-based features provide the most promising results. Future steps include integrating depth information with these features and adding temporal consistency to the pipeline.
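The abstract describes the skin-segmentation-based histogram signature only at a high level. As an illustration of the general idea (a compact, spatially pooled skin-presence descriptor), the sketch below combines a standard rule-based RGB skin heuristic with a coarse spatial grid. The thresholds, grid size, and function names are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def skin_mask(rgb):
    """Rule-based skin detection on an RGB image (H, W, 3), uint8.
    A common explicit-threshold heuristic; NOT the paper's segmenter."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20)
            & (r - np.minimum(g, b) > 15)
            & (r > g) & (r > b))

def skin_histogram_signature(rgb, grid=(4, 4)):
    """Compact signature: fraction of skin pixels in each cell of a
    spatial grid, flattened to a short feature vector."""
    mask = skin_mask(rgb)
    h, w = mask.shape
    gh, gw = grid
    sig = np.zeros(gh * gw)
    for i in range(gh):
        for j in range(gw):
            cell = mask[i * h // gh:(i + 1) * h // gh,
                        j * w // gw:(j + 1) * w // gw]
            sig[i * gw + j] = cell.mean()  # skin-pixel fraction per cell
    return sig
```

Such a signature is cheap to compute and, because hands dominate the egocentric field of view during manipulation, it plausibly separates manipulation from non-manipulation frames even though it carries far less information than SIFT or CNN features.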
Original language | English |
---|---|
Journal | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
Pages (from-to) | 611-617 |
Number of pages | 7 |
ISSN | 2160-7508 |
DOI | |
Status | Published - 24 Sep 2014 |
Externally published | Yes |
Event | 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2014 - Columbus, USA Duration: 23 Jun 2014 → 28 Jun 2014 |
Conference
Conference | 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2014 |
---|---|
Country | USA |
City | Columbus |
Period | 23/06/2014 → 28/06/2014 |
Bibliographic note
Publisher Copyright:
© 2014 IEEE.
ID: 302044204