Research Article
Open access
Published on 22 February 2024

Battery-aware federated learning: Challenges and solutions

Ziyue Zhang 1,*
  • 1 Wuhan Britain-China School

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/41/20230721

Abstract

Smartphone battery life is a pivotal factor in consumers' purchasing decisions. Recent years have witnessed a surge in studies on smartphone energy management, with data-driven energy management systems offering ways to prolong battery life. Federated Learning (FL) has emerged as a promising distributed learning approach in which wireless devices upload locally trained models rather than raw data, enabling collaborative learning without exposing sensitive information. This paper explores the FL process, particularly the Federated Averaging (FedAvg) algorithm, which performs well when client data are homogeneous. In the era of burgeoning data generation, traditional cloud computing systems face limitations, driving the adoption of Edge Computing (EC), which processes data closer to its source and improves response times. To make FL efficient at the edge, resource constraints must be addressed through techniques such as multiple local updates and model compression, which reduce communication overhead. However, FL also raises challenges related to heterogeneous data distributions and privacy; solutions such as differential privacy, encryption, and access control are discussed. In conclusion, this paper presents an overview of smartphone battery life, data-driven energy management, and the potential of FL, emphasizing its relevance in the age of EC. By addressing resource limitations and privacy concerns, FL holds promise for efficient data processing.
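To make the FedAvg step mentioned in the abstract concrete, the following is a minimal Python sketch (not taken from the paper) of the server-side aggregation: each client trains locally and reports only its model parameters and local dataset size, and the server forms a size-weighted average. The names fedavg_aggregate, client_params, and client_sizes are illustrative placeholders introduced here, assuming model parameters are held as NumPy arrays.

from typing import List

import numpy as np


def fedavg_aggregate(client_params: List[List[np.ndarray]],
                     client_sizes: List[int]) -> List[np.ndarray]:
    """Combine client models with a weighted average over local dataset sizes."""
    total = float(sum(client_sizes))
    num_layers = len(client_params[0])
    global_params = []
    for layer in range(num_layers):
        # Each client's layer contributes in proportion to how much data it holds.
        layer_avg = sum((n / total) * params[layer]
                        for params, n in zip(client_params, client_sizes))
        global_params.append(layer_avg)
    return global_params


# Example round: three clients with unequal local dataset sizes report
# two-layer models; only parameters and dataset sizes reach the server.
clients = [[np.full((2, 2), 1.0), np.full(2, 0.1)],
           [np.full((2, 2), 2.0), np.full(2, 0.2)],
           [np.full((2, 2), 3.0), np.full(2, 0.3)]]
sizes = [100, 200, 700]
new_global_model = fedavg_aggregate(clients, sizes)

Because only parameter arrays and dataset sizes leave the device, raw training data stays local, which is the privacy property the abstract highlights; quantizing or sparsifying these arrays before upload is one way to reduce the communication overhead noted above.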

Keywords

Federated Learning, Battery Life Extension, FedAvg

Cite this article

Zhang, Z. (2024). Battery-aware federated learning: Challenges and solutions. Applied and Computational Engineering, 41, 112-118.

Data availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 2023 International Conference on Machine Learning and Automation

Conference website: https://2023.confmla.org/
ISBN: 978-1-83558-307-4 (Print) / 978-1-83558-308-1 (Online)
Conference date: 18 October 2023
Editor: Mustafa İSTANBULLU
Series: Applied and Computational Engineering
Volume number: Vol. 41
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).