Status: Published
Locality-aware and Fault-tolerant Batching for Machine Learning on Distributed Datasets
Liu, Liu1; Ding, Zhijun1; Cheng, Dazhao2; Zhou, Xiaobo3
Year: 2024
Source Publication: IEEE Transactions on Cloud Computing
ISSN: 2168-7161
Volume: 12; Issue: 2; Pages: 370-387
Abstract

The performance of distributed ML training is largely determined by the workers that generate gradients at the slowest pace, i.e., stragglers. State-of-the-art load-balancing approaches assume that each worker stores a complete dataset locally and that the data fetching time can be ignored, so they consider only the computation capacity of workers when equalizing the gradient computation time. However, we find that in scenarios of ML on distributed datasets, whether in edge computing or in distributed data cache systems, the data fetching time is non-negligible and often becomes the primary cause of stragglers. In this paper, we present LOFT, an adaptive load-balancing approach for ML on distributed datasets at the edge. It aims to balance the time to generate gradients at each worker while preserving model accuracy. Specifically, LOFT features locality-aware batching: it builds performance and optimization models that capture both data fetching and gradient computation time, and leverages these models in an adaptive batch-sizing scheme based on grid search. Furthermore, LOFT offers Byzantine gradient aggregation on top of Ring All-Reduce, which makes it fault-tolerant to the Byzantine gradients that small batch sizes can induce. Experiments with twelve public DNN models and four open datasets show that LOFT reduces training time by up to 46% while reducing training loss by up to 67%, compared to LB-BSP.

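The abstract outlines the core mechanism: model each worker's per-iteration time as data fetching plus gradient computation, then search for per-worker batch sizes that equalize those times while keeping the global batch size (and thus accuracy-relevant hyperparameters) fixed. The Python sketch below illustrates that idea only; it is not the authors' implementation, and the linear per-sample cost model, function names, and example numbers are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of locality-aware batch sizing:
# per-worker iteration time is modeled as fetch time + compute time, and a
# grid search picks batch sizes that minimize the straggler's time while
# preserving the global batch size. The linear cost model is an assumption.
from itertools import product

def iteration_time(batch, fetch_cost, compute_cost):
    """Time for one worker to produce a gradient: data fetching + computation."""
    return batch * (fetch_cost + compute_cost)

def grid_search_batches(fetch_costs, compute_costs, global_batch, step=8):
    """Exhaustive grid search over per-worker batch sizes (illustrative only;
    exponential in the number of workers, so a real system would prune)."""
    n = len(fetch_costs)
    candidates = range(step, global_batch, step)
    best, best_time = None, float("inf")
    # Enumerate batch sizes for the first n-1 workers; the last worker
    # takes whatever remains, so the global batch size stays fixed.
    for combo in product(candidates, repeat=n - 1):
        last = global_batch - sum(combo)
        if last <= 0:
            continue
        batches = list(combo) + [last]
        # An iteration ends when the slowest worker (the straggler) finishes.
        t = max(iteration_time(b, f, c)
                for b, f, c in zip(batches, fetch_costs, compute_costs))
        if t < best_time:
            best, best_time = batches, t
    return best, best_time

# Hypothetical costs (seconds/sample): worker 2 fetches from a remote shard,
# worker 3 has a slower GPU; both would straggle under uniform batching.
print(grid_search_batches(fetch_costs=[0.004, 0.020, 0.006],
                          compute_costs=[0.010, 0.008, 0.030],
                          global_batch=256))
```

The sketch covers only the batching side; per the abstract, LOFT additionally applies Byzantine-tolerant gradient aggregation on top of Ring All-Reduce, since very small per-worker batches can produce outlier (Byzantine) gradients.
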
Keywords: Adaptation Models; Byzantine Gradient; Computational Modeling; Data Models; Distributed Databases; Distributed Dataset; Graphics Processing Units; Load Management; Machine Learning Training; Straggler; Training
DOI: 10.1109/TCC.2024.3351716
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Information Systems; Computer Science, Software Engineering; Computer Science, Theory & Methods
WOS ID: WOS:001241591300012
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 445 HOES LANE, PISCATAWAY, NJ 08855-4141
Scopus ID: 2-s2.0-85182357532
Document Type: Journal article
Collection: THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU)
Corresponding Author: Zhou, Xiaobo
Affiliation:
1. Department of Computer Science and Technology, Tongji University, Shanghai, China
2. School of Computer Science, Wuhan University, Hubei, China
3. IOTSC Lab & Department of Computer and Information Science, University of Macau, Macau, China
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714: Liu, Liu, Ding, Zhijun, Cheng, Dazhao, et al. Locality-aware and Fault-tolerant Batching for Machine Learning on Distributed Datasets[J]. IEEE Transactions on Cloud Computing, 2024, 12(2): 370-387.
APA: Liu, Liu, Ding, Zhijun, Cheng, Dazhao, & Zhou, Xiaobo. (2024). Locality-aware and Fault-tolerant Batching for Machine Learning on Distributed Datasets. IEEE Transactions on Cloud Computing, 12(2), 370-387.
MLA: Liu, Liu, et al. "Locality-aware and Fault-tolerant Batching for Machine Learning on Distributed Datasets." IEEE Transactions on Cloud Computing 12.2 (2024): 370-387.
Files in This Item:
There are no files associated with this item.
Related Services
Similar articles in Google Scholar
Similar articles in Baidu Academic
Similar articles in Bing Scholar
[Liu, Liu]'s Articles
[Ding, Zhijun]'s Articles
[Cheng, Dazhao]'s Articles