A machine learning approach for gesture recognition with a lensless smart sensor system
Normani, Niccolo; Urru, Andrea; Abraham, Lizy; Walsh, Michael; Tedesco, Salvatore; Cenedese, A.; Susto, Gian Antonio; O'Flynn, Brendan
Date:
2018-03
Copyright:
© 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Citation:
Normani, N., Urru, A., Abraham, L., Walsh, M., Tedesco, S., Cenedese, A., Susto, G. A. and O'Flynn, B. (2018) 'A machine learning approach for gesture recognition with a lensless smart sensor system', 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Las Vegas, NV, USA, 4-7 March, pp. 136-139. doi: 10.1109/BSN.2018.8329677
Abstract:
Hand motion tracking traditionally requires systems that are highly complex and expensive in terms of energy and computational demands. A low-power, low-cost system could lead to a revolution in this field, as it would not require complex hardware while representing an infrastructure-less, ultra-miniature (~100 μm [1]) solution. The present paper exploits the Multiple Point Tracking algorithm developed at the Tyndall National Institute as the basic algorithm to perform a series of gesture recognition tasks. The hardware relies on the stereoscopic vision provided by two novel Lensless Smart Sensors (LSS) equipped with IR filters, together with five LEDs held on the hand as tracking targets. Tracking common gestures generates a six-gesture dataset, which is then employed to train three Machine Learning models: k-Nearest Neighbors, Support Vector Machine, and Random Forest. An offline analysis highlights how different LED positions on the hand affect classification accuracy. The comparison shows that the Random Forest outperforms the other two models, with a classification accuracy of 90-91%.
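The model comparison described in the abstract can be illustrated with a minimal sketch, assuming scikit-learn. This is not the authors' implementation: the paper's features come from LSS-tracked LED trajectories, which are not reproduced here, so a synthetic six-class dataset stands in for them purely for illustration.

```python
# Minimal sketch (not the authors' code): training and comparing the three
# classifiers named in the abstract. Synthetic data with six classes (one
# per gesture) substitutes for the LED-trajectory features used in the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Placeholder for trajectory-derived features: six gesture classes.
X, y = make_classification(n_samples=600, n_features=30, n_informative=15,
                           n_classes=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.2f}")
```

All hyperparameters above (neighbor count, kernel, tree count) are illustrative defaults, not values reported by the paper.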