A machine learning approach for gesture recognition with a lensless smart sensor system


dc.contributor.author Normani, Niccolo
dc.contributor.author Urru, Andrea
dc.contributor.author Abraham, Lizy
dc.contributor.author Walsh, Michael
dc.contributor.author Tedesco, Salvatore
dc.contributor.author Cenedese, A.
dc.contributor.author Susto, Gian Antonio
dc.contributor.author O'Flynn, Brendan
dc.date.accessioned 2018-10-12T14:59:23Z
dc.date.available 2018-10-12T14:59:23Z
dc.date.issued 2018-03
dc.identifier.citation Normani, N., Urru, A., Abraham, L., Walsh, M., Tedesco, S., Cenedese, A., Susto, G. A. and O'Flynn, B. (2018) 'A machine learning approach for gesture recognition with a lensless smart sensor system', 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Las Vegas, NV, USA, 4-7 March, pp. 136-139. doi: 10.1109/BSN.2018.8329677 en
dc.identifier.startpage 136 en
dc.identifier.endpage 139 en
dc.identifier.isbn 978-1-5386-1109-8
dc.identifier.isbn 978-1-5386-1110-4
dc.identifier.uri http://hdl.handle.net/10468/7008
dc.identifier.doi 10.1109/BSN.2018.8329677
dc.description.abstract Hand motion tracking traditionally requires systems that are highly complex and expensive in terms of energy and computational demands. A low-power, low-cost system could revolutionize this field, as it would not require complex hardware while offering an infrastructure-less, ultra-miniature (~100 μm [1]) solution. The present paper exploits the Multiple Point Tracking algorithm developed at the Tyndall National Institute as the basis for a series of gesture recognition tasks. The hardware relies upon the stereoscopic vision provided by two novel Lensless Smart Sensors (LSS) fitted with IR filters, together with five hand-held LEDs to be tracked. Tracking common gestures generates a six-gesture dataset, which is then used to train three Machine Learning models: k-Nearest Neighbors, Support Vector Machine and Random Forest. An offline analysis highlights how different LED positions on the hand affect the classification accuracy. The comparison shows that the Random Forest outperforms the other two models with a classification accuracy of 90-91%. en
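
The abstract describes training and comparing three classifiers (k-Nearest Neighbors, Support Vector Machine, Random Forest) on a six-gesture dataset derived from tracked LED positions. The snippet below is a minimal sketch of that kind of offline comparison, assuming a scikit-learn workflow; the synthetic feature matrix, feature dimensionality, and hyperparameters are placeholders and do not reflect the authors' actual LSS data or model settings.

    # Minimal sketch of a three-classifier comparison on a six-gesture dataset.
    # Features and hyperparameters are assumptions, not the paper's pipeline.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Stand-in dataset: 600 gesture samples, 6 classes, each sample a flattened
    # feature vector derived from tracked LED trajectories (dimensionality assumed).
    X = rng.normal(size=(600, 30))
    y = rng.integers(0, 6, size=600)

    models = {
        "k-Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
        "Support Vector Machine": SVC(kernel="rbf", C=1.0),
        "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    }

    # Offline comparison: 5-fold cross-validated accuracy for each model.
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")

With real features extracted from the Multiple Point Tracking output for each LED placement, a loop of this form could serve for the sort of offline accuracy comparison the paper reports.
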
dc.description.sponsorship Science Foundation Ireland (13/RC/2077) en
dc.format.mimetype application/pdf en
dc.language.iso en en
dc.relation.ispartof 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN)
dc.relation.uri https://ieeexplore.ieee.org/document/8329677
dc.rights © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. en
dc.subject Computer vision en
dc.subject Digital filters en
dc.subject Gesture recognition en
dc.subject Image motion analysis en
dc.subject Intelligent sensors en
dc.subject Learning (artificial intelligence) en
dc.subject Light emitting diodes en
dc.subject Object tracking en
dc.subject Stereo image processing en
dc.subject Support vector machines en
dc.subject Random Forest en
dc.subject Machine learning approach en
dc.subject Lensless smart sensor system en
dc.subject Hand motion tracking en
dc.subject Computational demands en
dc.subject Low-cost system en
dc.subject Complex hardware en
dc.subject Multiple Point Tracking algorithm en
dc.subject Tyndall National Institute en
dc.subject Gesture recognition tasks en
dc.subject Stereoscopic vision en
dc.subject IR filters en
dc.subject Tracking common gestures en
dc.subject Six-gestures dataset en
dc.subject Machine Learning models en
dc.subject Support Vector Machine en
dc.subject Low-power system en
dc.subject Infrastructure-less ultra-miniature solution en
dc.subject Hand-held LED en
dc.subject k-Nearest Neighbors en
dc.subject Sensors en
dc.subject Hardware en
dc.subject Radio frequency en
dc.subject Calibration en
dc.subject Lensless Smart Sensor en
dc.subject Machine Learning en
dc.title A machine learning approach for gesture recognition with a lensless smart sensor system en
dc.type Conference item en
dc.internal.authorcontactother Brendan O'Flynn, Tyndall Microsystems, University College Cork, Cork, Ireland. +353-21-490-3000 Email: brendan.oflynn@tyndall.ie en
dc.internal.availability Full text available en
dc.date.updated 2018-10-12T14:53:34Z
dc.description.version Accepted Version en
dc.internal.rssid 457613095
dc.contributor.funder Science Foundation Ireland en
dc.contributor.funder European Regional Development Fund en
dc.description.status Peer reviewed en
dc.internal.copyrightchecked No en
dc.internal.licenseacceptance Yes en
dc.internal.conferencelocation Las Vegas, Nevada, USA en
dc.internal.IRISemailaddress brendan.oflynn@tyndall.ie en
dc.relation.project info:eu-repo/grantAgreement/SFI/SFI Research Centres/13/RC/2077/IE/CONNECT: The Centre for Future Networks & Communications/ en

