DP-MTFL: differentially private multi-tier federated learning for IoT applications

Date
2024-07-24
Authors
Soleimani, Ramin
Pesch, Dirk
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Differentially Private Federated Learning (DP-FL) is a privacy-preserving machine learning paradigm. Building on a standard DP-FL approach, we introduce and implement a novel Differentially Private Multi-Tier Federated Learning approach tailored for IoT applications, in particular short-term load forecasting. Our method combines a Sampled Gaussian Mechanism for differential privacy with a hierarchical federated learning architecture in which local federations participate in learning while providing approximate differential privacy with respect to a global server. We study in particular the effect of the number of local rounds on global model convergence. Our findings show that non-DP models with fewer local rounds perform slightly better than DP-enabled models. However, integrating DP by injecting additional noise during training with larger numbers of local rounds improves the generalisation of the global model, suggesting that the Sampled Gaussian Mechanism acts as a form of regularisation. We evaluate our method on an energy consumption dataset from the UK Power Networks Low Carbon London project. Our results show that our approach achieves its privacy-preserving objectives while identifying the number of local rounds that minimises the prediction error.
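To make the mechanism described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of one aggregation step with a Sampled Gaussian Mechanism: client updates are Poisson-subsampled, clipped in L2 norm, summed, and perturbed with Gaussian noise. The function name and parameters (`clip_norm`, `noise_multiplier`, `sample_rate`) are illustrative assumptions.

```python
import numpy as np

def sampled_gaussian_step(per_client_updates, clip_norm, noise_multiplier,
                          sample_rate, rng=None):
    """One aggregation step of a Sampled Gaussian Mechanism (sketch).

    Hypothetical helper, not the paper's code: each client update is
    included with probability `sample_rate` (Poisson subsampling),
    clipped to `clip_norm` in L2 norm, summed, and perturbed with
    Gaussian noise of standard deviation `noise_multiplier * clip_norm`.
    """
    rng = rng or np.random.default_rng()
    dim = per_client_updates[0].shape
    total = np.zeros(dim)
    n_sampled = 0
    for u in per_client_updates:
        if rng.random() < sample_rate:  # Poisson subsampling of clients
            norm = np.linalg.norm(u)
            # Scale down updates whose L2 norm exceeds the clipping bound
            total += u * min(1.0, clip_norm / max(norm, 1e-12))
            n_sampled += 1
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=dim)
    return (total + noise) / max(n_sampled, 1)
```

With `noise_multiplier = 0` and `sample_rate = 1.0` this reduces to plain clipped averaging, which is a useful sanity check before enabling privacy noise.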
Keywords
Federated learning, Internet of Things, Differential privacy, Sampled Gaussian Mechanism
Citation
Soleimani, R. and Pesch, D. (2024) ‘DP-MTFL: differentially private multi-tier federated learning for IoT applications’, 2024 IEEE International Conference on Smart Computing (SMARTCOMP), Osaka, Japan, pp. 158–165. https://doi.org/10.1109/SMARTCOMP61445.2024.00042