Multimodal sensor fusion for low-power wearable human motion tracking systems in sports applications


dc.contributor.author Wilk, Mariusz P.
dc.contributor.author Walsh, Michael
dc.contributor.author O'Flynn, Brendan
dc.date.accessioned 2020-10-19T15:23:48Z
dc.date.available 2020-10-19T15:23:48Z
dc.date.issued 2020-10-13
dc.identifier.citation Wilk, M. P., Walsh, M. and O’Flynn, B. (2020) 'Multimodal Sensor Fusion for Low-Power Wearable Human Motion Tracking Systems in Sports Applications', IEEE Sensors Journal, doi: 10.1109/JSEN.2020.3030779 en
dc.identifier.startpage 1 en
dc.identifier.endpage 2 en
dc.identifier.issn 1558-1748
dc.identifier.uri http://hdl.handle.net/10468/10666
dc.identifier.doi 10.1109/JSEN.2020.3030779 en
dc.description.abstract This paper presents a prototype human motion tracking system for wearable sports applications. It is particularly applicable to tracking human motion during certain strength training exercises, such as the barbell squat, where inappropriate technique could result in injury. The key novelty of the proposed system is twofold. Firstly, it is an inside-out, multimodal motion tracker that incorporates two complementary sensor modalities, i.e. a camera and an inertial motion sensor, as well as two externally mounted points of reference. Secondly, it incorporates a novel multimodal sensor fusion algorithm which exploits the complementary nature of the vision and inertial sensor modalities to perform computationally efficient 3-Dimensional (3-D) pose detection of the wearable device. The 3-D pose is determined by fusing information about the two external reference points captured by the camera with the orientation angles captured by the inertial motion sensor. The accuracy of the prototype was experimentally validated under laboratory conditions. The main findings are as follows. The Root Mean Square Error (RMSE) in 3-D position calculation was 36.7 mm and 13.6 mm in the static and mobile cases, respectively. Whereas the static case was aimed at determining the system's performance across all 3-D poses within the work envelope, the mobile case was used to determine the error in tracking the human motion involved in the barbell squat, i.e. a mainly repetitive vertical motion pattern. en
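For readers who want a concrete picture of the fusion step summarized in the abstract, the following is a minimal, hypothetical sketch and not the authors' published algorithm. It assumes the inertial motion sensor supplies the camera's orientation as a world-to-camera rotation matrix R_wc, that the camera intrinsics K are known, and that the two external reference points have known world coordinates and detected pixel locations; the names estimate_position, world_pts, and pixels are illustrative only. Under those assumptions, the device's 3-D position can be recovered by linear least squares.

    # Hypothetical sketch (not the paper's algorithm): estimate the wearable camera's
    # 3-D position from two known reference points, with orientation taken from the IMU.
    import numpy as np

    def estimate_position(K, R_wc, world_pts, pixels):
        """Least-squares camera centre C from known reference points.

        Each observation satisfies R_wc @ (P_i - C) = s_i * d_i, where d_i is the
        back-projected viewing ray K^-1 [u, v, 1]^T and s_i is an unknown depth.
        Stacking the equations gives a linear system in [C, s_1, ..., s_n].
        """
        n = len(world_pts)
        K_inv = np.linalg.inv(K)
        A = np.zeros((3 * n, 3 + n))
        b = np.zeros(3 * n)
        for i, (P, uv) in enumerate(zip(world_pts, pixels)):
            d = K_inv @ np.array([uv[0], uv[1], 1.0])   # viewing ray in camera frame
            rows = slice(3 * i, 3 * i + 3)
            A[rows, 0:3] = -R_wc                        # coefficient of camera centre C
            A[rows, 3 + i] = -d                         # coefficient of unknown depth s_i
            b[rows] = -R_wc @ P
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x[:3]                                    # estimated camera centre in world frame

    # Toy usage with made-up numbers: camera at the origin, level, facing +Z.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R_wc = np.eye(3)
    world_pts = [np.array([-0.5, 0.0, 3.0]), np.array([0.5, 0.0, 3.0])]
    pixels = [(186.7, 240.0), (453.3, 240.0)]
    print(estimate_position(K, R_wc, world_pts, pixels))  # ~ [0, 0, 0]

In this sketch, fixing the orientation with the inertial sensor means each reference point contributes three linear constraints on five unknowns (the position plus two depths), so two camera-detected markers already over-determine the solution; this illustrates, under the stated assumptions, why a single monocular camera and two external reference points can suffice for 3-D position.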
dc.format.mimetype application/pdf en
dc.language.iso en en
dc.publisher Institute of Electrical and Electronics Engineers (IEEE) en
dc.relation.uri https://ieeexplore.ieee.org/document/9222148
dc.rights © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. en
dc.subject Tracking en
dc.subject Cameras en
dc.subject Multimodal sensors en
dc.subject Injuries en
dc.subject Sensor fusion en
dc.subject Vision sensors en
dc.subject Inertial Motion Sensor en
dc.subject Inside-Out Tracking en
dc.subject Pose Detection en
dc.subject Monocular Camera en
dc.subject Multimodal en
dc.subject 3-D en
dc.title Multimodal sensor fusion for low-power wearable human motion tracking systems in sports applications en
dc.type Article (peer-reviewed) en
dc.internal.authorcontactother Brendan O'Flynn, Tyndall Microsystems, University College Cork, Cork, Ireland. +353-21-490-3000 Email: brendan.oflynn@tyndall.ie en
dc.internal.availability Full text available en
dc.date.updated 2020-10-19T15:15:00Z
dc.description.version Accepted Version en
dc.internal.rssid 540821039
dc.contributor.funder Science Foundation Ireland en
dc.contributor.funder European Regional Development Fund en
dc.description.status Peer reviewed en
dc.identifier.journaltitle IEEE Sensors Journal en
dc.internal.copyrightchecked Yes
dc.internal.licenseacceptance Yes en
dc.internal.IRISemailaddress brendan.oflynn@tyndall.ie en
dc.internal.IRISemailaddress michael.walsh@tyndall.ie en
dc.internal.bibliocheck In press. Update citation. Add volume, page numbers. en
dc.relation.project info:eu-repo/grantAgreement/SFI/SFI Research Centres/13/RC/2077/IE/CONNECT: The Centre for Future Networks & Communications/ en

