This paper introduces a hand gesture recognition sensor using ultra-wideband (UWB) impulse signals reflected from a hand. The time-domain reflected waveform is determined by the reflecting surface of the target, so every gesture produces its own characteristic waveform. We therefore propose to use machine learning, specifically a convolutional neural network (CNN), for gesture classification: the CNN extracts its own features, constructs a classification model, and then classifies the reflected waveforms. Six hand gestures from American Sign Language (ASL) are used in an experiment, and the results show more than 90% recognition accuracy. For fine movements, a rotating plaster model is measured in 10° steps; the average recognition accuracy is also above 90%.
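The classification pipeline described above can be sketched as a 1D-CNN forward pass over a reflected waveform: convolutional filters extract features from the time-domain signal, pooling summarizes them, and a linear layer with softmax scores the six gesture classes. This is a minimal illustrative sketch with assumed sizes (256-sample waveform, 8 filters of width 16) and random weights, not the authors' actual network or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    # x: (length,), kernels: (n_filters, k) -> feature maps (n_filters, length - k + 1)
    return np.stack([np.convolve(x, kern[::-1], mode="valid") for kern in kernels])

def cnn_forward(waveform, kernels, W, b):
    feat = np.maximum(conv1d(waveform, kernels), 0.0)  # convolution + ReLU
    pooled = feat.mean(axis=1)                         # global average pooling
    logits = W @ pooled + b                            # linear classifier head
    exp = np.exp(logits - logits.max())                # numerically stable softmax
    return exp / exp.sum()                             # class probabilities (6 gestures)

# Hypothetical dimensions: 256-sample reflected waveform, 8 filters, 6 ASL gestures.
waveform = rng.standard_normal(256)
kernels = rng.standard_normal((8, 16)) * 0.1
W = rng.standard_normal((6, 8)) * 0.1
b = np.zeros(6)

probs = cnn_forward(waveform, kernels, W, b)
print(probs.shape)  # one probability per gesture class
```

In practice the filters and classifier weights would be learned end-to-end from labeled reflected waveforms rather than drawn at random; the sketch only shows the shape of the inference computation.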
Publisher Copyright: © 2001-2012 IEEE.