Status | Published
Title | Snowball Effect in Federated Learning: An Approach of Exponentially Expanding Structures for Optimizing the Training Efficiency
Authors | Cheng, Guoliang1,2; Li, Peichun3,4; Tan, Beihai5; Yu, Rong1,2; Wu, Yuan3,4; Pan, Miao6
Issue Date | 2024-10-14
Source Publication | IEEE Transactions on Cognitive Communications and Networking |
ISSN | 2332-7731 |
Abstract | Efficient federated learning (FL) in mobile edge networks faces challenges due to energy-consuming on-device training and wireless transmission. Optimizing the neural network structure is an effective approach to achieving energy savings. In this paper, we present a Snowball FL training framework with an expanding neural network structure, which starts with a small-sized submodel and gradually progresses to a full-sized model. To achieve this, we first design the submodel and embedding extraction schemes for fine-grained model structure expansion. We then investigate the joint minimization of the global training loss and the system-wise energy consumption, and decompose this optimization problem into a long-term model structure expansion subproblem and a single-round local resource allocation subproblem. The former subproblem is transformed into a variational calculus problem by leveraging a theoretical analysis of the convergence bound; the Euler-Lagrange method is used to derive the solution, where the optimal evolution strategy for the model structure increases exponentially with the global round (i.e., the Snowball effect). The latter subproblem is solved by convex optimization to acquire the optimal computing frequency and transmission power. Experiments indicate that the proposed framework saves about 50% of energy consumption while achieving on-par accuracy against state-of-the-art algorithms.
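The abstract describes a schedule in which the submodel grows exponentially with the global round until the full-sized model is reached. A minimal sketch of such a schedule, assuming a hypothetical base ratio `p0` and growth rate `g` (neither is specified in the abstract), might look like:

```python
# Illustrative sketch only, not the paper's implementation: the submodel
# fraction grows exponentially with the global round t and saturates at
# the full model. `p0` and `g` are hypothetical parameters.

def snowball_schedule(num_rounds: int, p0: float = 0.1, g: float = 1.3) -> list[float]:
    """Return the fraction of the full model trained in each global round."""
    return [min(1.0, p0 * g ** t) for t in range(num_rounds)]

schedule = snowball_schedule(12)
# The fraction is non-decreasing and eventually reaches the full model.
assert all(a <= b for a, b in zip(schedule, schedule[1:]))
assert schedule[-1] == 1.0
```

Early rounds thus train only a small submodel (saving on-device energy), while later rounds refine the full model, matching the "Snowball" intuition in the abstract.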
Keyword | Federated Learning; Distributed Computing; Resource Management; Optimization Methods
DOI | 10.1109/TCCN.2024.3480045 |
Language | English
Scopus ID | 2-s2.0-85207358987 |
Document Type | Journal article |
Collection | Faculty of Science and Technology; THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU); DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE
Corresponding Author | Tan, Beihai; Yu, Rong |
Affiliation | 1. Guangdong University of Technology, School of Automation, Guangzhou, China; 2. Guangdong Key Laboratory of IoT Information Technology, Guangzhou, China; 3. University of Macau, State Key Laboratory of Internet of Things for Smart City, Macao; 4. University of Macau, Department of Computer and Information Science, Macao; 5. Guangdong University of Technology, School of Integrated Circuits, Guangzhou, China; 6. University of Houston, Department of Electrical and Computer Engineering, Houston, United States
Recommended Citation GB/T 7714 | Cheng, Guoliang, Li, Peichun, Tan, Beihai, et al. Snowball Effect in Federated Learning: An Approach of Exponentially Expanding Structures for Optimizing the Training Efficiency[J]. IEEE Transactions on Cognitive Communications and Networking, 2024.
APA | Cheng, Guoliang, Li, Peichun, Tan, Beihai, Yu, Rong, Wu, Yuan, & Pan, Miao (2024). Snowball Effect in Federated Learning: An Approach of Exponentially Expanding Structures for Optimizing the Training Efficiency. IEEE Transactions on Cognitive Communications and Networking.
MLA | Cheng, Guoliang, et al. "Snowball Effect in Federated Learning: An Approach of Exponentially Expanding Structures for Optimizing the Training Efficiency". IEEE Transactions on Cognitive Communications and Networking (2024).
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.