Data fusion for human motion tracking with multimodal sensing

Show simple item record

dc.contributor.advisor O'Flynn, Brendan en
dc.contributor.advisor Walsh, Michael en Wilk, Mariusz P. 2020-06-02T12:35:31Z
dc.date.available 2020-06-02T12:35:31Z
dc.date.issued 2020-05-07
dc.date.submitted 2020-05-07
dc.identifier.citation Wilk, M. P. 2020. Data fusion for human motion tracking with multimodal sensing. PhD Thesis, University College Cork. en
dc.identifier.endpage 128 en
dc.description.abstract Multimodal sensor fusion is a common approach in the design of many motion tracking systems. It is based on using more than one sensor modality to measure different aspects of a phenomenon, capturing more information than would otherwise be available from a single sensor. Multimodal sensor fusion algorithms often leverage the complementary nature of the different modalities to compensate for the shortcomings of the individual sensors. This approach is particularly suitable for low-cost and highly miniaturised wearable human motion tracking systems that are expected to perform their function with limited resources at their disposal (energy, processing power, etc.). Opto-inertial motion trackers are among the most commonly used approaches in this context. These trackers fuse the sensor data from vision and Inertial Measurement Unit (IMU) sensors to determine the 3-Dimensional (3-D) pose of the given body part, i.e. its position and orientation. Continuous advances in the State-Of-the-Art (SOA) in camera miniaturisation and efficient point detection algorithms, along with more robust IMUs and increasing processing power in a shrinking form factor, make it increasingly feasible to develop a low-cost, low-power, and highly miniaturised wearable smart sensor human motion tracking system that incorporates these two sensor modalities. In this thesis, a multimodal human motion tracking system is presented that builds on these developments. The proposed system consists of a wearable smart sensor system, referred to as the Wearable Platform (WP), which incorporates the two sensor modalities, i.e. a monocular camera (optical) and an IMU (motion). The WP operates in conjunction with two optical points of reference embedded in the ambient environment to enable positional tracking in that environment. 
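The abstract does not give the fusion equations, but the complementary principle it describes (a fast, drifting inertial estimate corrected by a slower, drift-free optical one) is commonly illustrated with a one-dimensional complementary filter. The sketch below is illustrative only; the function name, the 0.98 weight, and the simulated bias are assumptions, not details from the thesis.

```python
def complementary_filter(angle_prev, gyro_rate, vision_angle, dt, alpha=0.98):
    """Fuse a gyro rate (fast but drifting) with a vision angle (slow but drift-free).

    alpha close to 1 trusts the integrated gyro in the short term, while the
    (1 - alpha) share of the vision measurement bounds long-term drift.
    """
    gyro_estimate = angle_prev + gyro_rate * dt
    return alpha * gyro_estimate + (1.0 - alpha) * vision_angle

# Simulated example: the true angle is a constant 10 degrees; the gyro
# reports a pure bias of 0.5 deg/s, while the camera reports the true angle.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5, vision_angle=10.0, dt=0.01)
# The vision term bounds the drift: the estimate settles near 10.245 degrees
# (10 plus a small steady-state offset caused by the gyro bias), instead of
# growing without bound as pure gyro integration would.
```

A Kalman filter achieves a similar correction with adaptive weights, but the fixed-weight form above is far cheaper, which matters on the resource-constrained processors the abstract targets.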
In addition, a novel multimodal sensor fusion algorithm is proposed which uses the complementary nature of the vision and IMU sensors, in conjunction with the two points of reference in the ambient environment, to determine the 3-D pose of the WP in a novel and computationally efficient way. To this end, the WP uses a low-resolution camera to track two points of reference; specifically, two Infrared (IR) LEDs embedded in the wall. The geometry formed between the WP and the IR LEDs, when complemented by the angular rotation measured by the IMU, simplifies the mathematical formulations involved in computing the 3-D pose, making them compatible with the resource-constrained microprocessors used in such wearable systems. Furthermore, the WP is coupled with the two IR LEDs via a radio link to control their intensity in real time. This enables the novel subpixel point detection algorithm to maintain its highest accuracy, thus increasing the overall precision of the pose detection algorithm. The resulting 3-D pose can be used as an input to a higher-level system for further use. One of the potential uses for the proposed system is in sports applications. For instance, it could be particularly useful for tracking the correct execution of certain exercises in Strength Training (ST) routines, such as the barbell squat. Thus, it can be used to assist professional ST coaches in remotely tracking their clients' progress and, most importantly, to minimise the risk of injury through real-time feedback. Despite its numerous benefits, the modern lifestyle has a negative impact on our health due to the increasingly sedentary behaviour it involves. The human body has evolved to be physically active. These lifestyle changes therefore need to be offset by adding regular physical activity to everyday life, of which ST is an important element. 
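The abstract only outlines the geometry (bearings to two known reference points, plus an IMU-supplied orientation), so the following is a minimal planar sketch of that idea, not the thesis algorithm: with the heading known from the IMU, the camera-frame bearings to the two LEDs become world-frame rays, and the position falls out of a 2x2 linear system. All names and coordinates are hypothetical.

```python
import math

def locate_from_two_leds(p1, p2, bearing1, bearing2, heading):
    """Recover the 2-D camera position from bearings to two known LEDs.

    p1, p2       -- known wall positions of the LEDs, as (x, y)
    bearing1, -2 -- angles to each LED measured in the camera frame (rad)
    heading      -- camera heading from the IMU (rad); it turns the
                    camera-frame bearings into world-frame ray directions
    """
    a1 = heading + bearing1                # world-frame angle of ray to LED 1
    a2 = heading + bearing2
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    # The camera position c satisfies p1 = c + r1*d1 and p2 = c + r2*d2,
    # so r1*d1 - r2*d2 = p1 - p2; solve this 2x2 system for r1 (Cramer's rule).
    det = -d1[0] * d2[1] + d1[1] * d2[0]
    bx, by = p1[0] - p2[0], p1[1] - p2[1]
    r1 = (-bx * d2[1] + by * d2[0]) / det
    return (p1[0] - r1 * d1[0], p1[1] - r1 * d1[1])

# Example: camera at the origin with heading 0, LEDs on the wall at x = 2.
pos = locate_from_two_leds((2.0, 1.0), (2.0, -1.0),
                           math.atan2(1, 2), math.atan2(-1, 2), heading=0.0)
# pos recovers (0.0, 0.0) up to floating-point error
```

Knowing the orientation beforehand is what keeps this linear; without the IMU heading, the same two bearings would leave a nonlinear resection problem, which is presumably the kind of simplification the abstract refers to.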
This work describes the following novel contributions:
• A new multimodal sensor fusion algorithm for 3-D pose detection with reduced mathematical complexity for resource-constrained platforms
• A novel system architecture for efficient 3-D pose detection for human motion tracking applications
• A new subpixel point detection algorithm for efficient and precise point detection at reduced camera resolution
• A new reference point estimation algorithm for finding locations of reference points used in validating subpixel point detection algorithms
• A novel proof-of-concept demonstrator prototype that implements the proposed system architecture and multimodal sensor fusion algorithm en
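The contributions list names a subpixel point detection algorithm without detailing it. A common baseline for locating a bright LED blob below full pixel resolution is an intensity-weighted centroid, sketched below; this is a standard technique for illustration, not the algorithm proposed in the thesis, and the threshold value is arbitrary.

```python
def subpixel_centroid(image, threshold=50):
    """Locate a bright spot with subpixel precision using an
    intensity-weighted centroid over above-threshold pixels.

    image -- 2-D list of grayscale values (rows of pixel intensities)
    Returns (x, y) in pixel coordinates, or None if no pixel is bright enough.
    """
    sx = sy = total = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v > threshold:
                sx += x * v
                sy += y * v
                total += v
    if total == 0:
        return None
    return (sx / total, sy / total)

# A 5x5 frame whose blob energy straddles pixel rows/columns 1 and 2:
frame = [
    [0,   0,   0, 0, 0],
    [0, 100, 100, 0, 0],
    [0, 100, 100, 0, 0],
    [0,   0,   0, 0, 0],
    [0,   0,   0, 0, 0],
]
cx, cy = subpixel_centroid(frame)   # -> (1.5, 1.5), between the bright pixels
```

Because the centroid weighting degrades when the blob saturates or fades, closing the loop over a radio link to hold LED intensity in a good range, as the abstract describes, directly protects this kind of estimator's accuracy.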
dc.format.mimetype application/pdf en
dc.language.iso en en
dc.publisher University College Cork en
dc.rights © 2020, Mariusz P. Wilk. en
dc.rights.uri en
dc.subject Sensor fusion en
dc.subject Multimodal en
dc.subject Motion tracking en
dc.subject Low power en
dc.subject Wearable en
dc.title Data fusion for human motion tracking with multimodal sensing en
dc.type Doctoral thesis en
dc.type.qualificationlevel Doctoral en
dc.type.qualificationname PhD - Doctor of Philosophy en
dc.internal.availability Full text not available en
dc.description.version Accepted Version en
dc.contributor.funder Science Foundation Ireland en
dc.contributor.funder European Regional Development Fund en
dc.description.status Not peer reviewed en Electrical and Electronic Engineering en
dc.internal.conferring Summer 2020 en
dc.internal.ricu Tyndall National Institute en
dc.relation.project info:eu-repo/grantAgreement/SFI/SFI Research Centres/13/RC/2077/IE/CONNECT: The Centre for Future Networks & Communications/ en
dc.availability.bitstream embargoed 2021-05-09


