CNNs for heart rate estimation and human activity recognition in wrist worn sensing applications

Files
EB_CameraReady_PerCom.pdf (408.82 KB)
Accepted version
Date
2020-03
Authors
Brophy, Eoin
Muehlhausen, Willie
Smeaton, Alan F.
Ward, Tomás E.
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Wrist-worn smart devices are providing increased insights into human health, behaviour and performance through sophisticated analytics. However, battery life, device cost and sensor performance in the face of movement-related artefact present challenges which must be further addressed to see effective applications and wider adoption through commoditisation of the technology. We address these challenges by demonstrating that a simple optical measurement, photoplethysmography (PPG), conventionally used for heart rate detection in wrist-worn sensors, can provide improved heart rate estimation and human activity recognition (HAR) simultaneously at low sample rates, without an inertial measurement unit. This simplifies hardware design and reduces costs and power budgets. We apply two deep learning pipelines, one for human activity recognition and one for heart rate estimation. HAR is achieved through a visual classification approach capable of robust performance at low sample rates: transfer learning is leveraged to retrain a convolutional neural network (CNN) to distinguish characteristics of the PPG during different human activities. For heart rate estimation we use a CNN adapted for regression, which maps noisy optical signals to heart rate estimates. In both cases, comparisons are made with leading conventional approaches. Our results demonstrate that a low sampling frequency can achieve good performance without significant degradation of accuracy. Sampling rates of 5 Hz and 10 Hz achieved HAR classification accuracies of 80.2% and 83.0% respectively, and the same rates yielded robust heart rate estimation comparable with that achieved at the more energy-intensive rate of 256 Hz.
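As a rough illustration of the two pipelines described in the abstract, the sketch below sets up (a) a transfer-learning image classifier for HAR and (b) a small 1-D CNN for heart rate regression. PyTorch, the ResNet-18 backbone, the layer sizes and the number of activity classes are assumptions for illustration only; the paper does not specify these details.

```python
# Minimal sketch only (PyTorch assumed; architectures and hyperparameters are illustrative,
# not the authors' exact models).
import torch
import torch.nn as nn
from torchvision import models

NUM_ACTIVITIES = 5  # hypothetical number of activity classes

# --- HAR pipeline: transfer learning on an ImageNet-pretrained CNN ---
# PPG windows are rendered as images (visual classification) and only the
# final layer is retrained to distinguish activities.
har_model = models.resnet18(weights="IMAGENET1K_V1")
for p in har_model.parameters():
    p.requires_grad = False                          # keep pretrained features fixed
har_model.fc = nn.Linear(har_model.fc.in_features,   # new head is trainable
                         NUM_ACTIVITIES)

# --- Heart rate pipeline: small 1-D CNN used for regression ---
# Maps a low-sample-rate PPG window directly to a heart rate estimate (bpm).
hr_model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, 1),                                # single scalar output: heart rate
)

# Example shapes: a batch of 8 PPG images, and 8 windows of 10 Hz PPG over 8 s (80 samples)
activity_logits = har_model(torch.randn(8, 3, 224, 224))  # -> (8, NUM_ACTIVITIES)
hr_estimate = hr_model(torch.randn(8, 1, 80))             # -> (8, 1)
```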
Keywords
Deep learning, Transfer learning, Photoplethysmography, Heart rate, Estimation, Activity recognition, Feature extraction, Monitoring, Training, Machine learning
Citation
Brophy, E., Muehlhausen, W., Smeaton, A. F. and Ward, T. E. (2020) 'CNNs for Heart Rate Estimation and Human Activity Recognition in Wrist Worn Sensing Applications'. 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Austin, TX, USA, 23-27 March, pp. 1-6. doi: 10.1109/PerComWorkshops48775.2020.9156120
Copyright
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.