EgoGesture: A New Dataset and Benchmark for Egocentric Hand Gesture Recognition
Jul 10, 2018

Authors: Zhang, YF; Cao, CQ; Cheng, J; Lu, HQ

Author Full Names: Zhang, Yifan; Cao, Congqi; Cheng, Jian; Lu, Hanqing

Source: IEEE TRANSACTIONS ON MULTIMEDIA, 20 (5):1038-1050; 10.1109/TMM.2018.2808769 MAY 2018

Language: English

Abstract: Gesture is a natural interface for human-computer interaction, especially for interacting with wearable devices such as VR/AR helmets and glasses. However, the gesture recognition community lacks suitable datasets for developing egocentric (first-person view) gesture recognition methods, particularly in the deep learning era. In this paper, we introduce a new benchmark dataset named EgoGesture with sufficient size, variation, and reality to train deep neural networks. The dataset contains more than 24,000 gesture samples and 3,000,000 frames in both color and depth modalities from 50 distinct subjects. We design 83 different static and dynamic gestures focused on interaction with wearable devices and collect them in six diverse indoor and outdoor scenes with variation in background and illumination, including a scenario in which subjects perform gestures while walking. The performance of several representative approaches is systematically evaluated on two tasks: gesture classification in segmented data, and gesture spotting and recognition in continuous data. Our empirical study also provides an in-depth analysis of input modality selection and domain adaptation between different scenes.
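The first benchmark task above, gesture classification in segmented data, is typically scored by top-1 accuracy over pre-segmented clips. A minimal sketch of that metric is shown below; the class IDs and prediction lists are hypothetical placeholders, not actual EgoGesture annotations or an API from the paper.

```python
# Minimal sketch: top-1 accuracy for segmented gesture classification.
# The label/prediction lists are toy placeholders, not EgoGesture data.

def top1_accuracy(predictions, labels):
    """Fraction of segmented clips whose predicted gesture class
    matches the ground-truth class."""
    assert len(predictions) == len(labels) and labels, "need matched, non-empty lists"
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Toy example with integer class IDs (EgoGesture defines 83 gesture classes).
preds = [3, 17, 17, 82, 5]
truth = [3, 17, 20, 82, 5]
print(top1_accuracy(preds, truth))  # 0.8
```

The second task, spotting and recognition in continuous data, additionally requires localizing gesture boundaries in an unsegmented stream, so it is usually evaluated with detection-style metrics rather than plain clip accuracy.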

ISSN: 1520-9210

eISSN: 1941-0077

IDS Number: GD7YD

Unique ID: WOS:000430728400002