TechRxiv

Agglomerative Federated Learning: Empowering Larger Model Training via End-Edge-Cloud Collaboration

preprint
posted on 2023-12-07, 03:28 authored by Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Quyang Pan, Tianliu He, Xuefeng Jiang

Federated Learning (FL) enables training Artificial Intelligence (AI) models over end devices without compromising their privacy. As computing tasks are increasingly performed by a combination of cloud, edge, and end devices, FL can benefit from this End-Edge-Cloud Collaboration (EECC) paradigm to achieve collaborative device-scale expansion with real-time access. Although Hierarchical Federated Learning (HFL) supports the multi-tier model aggregation suited to EECC, prior works assume the same model structure on all computing nodes, constraining the model scale to what the weakest end devices can train. To address this issue, we propose Agglomerative Federated Learning (FedAgg), a novel EECC-empowered FL framework that allows the trained models from end, to edge, to cloud to grow larger in size and stronger in generalization ability. FedAgg recursively organizes computing nodes across all tiers based on the Bridge Sample Based Online Distillation Protocol (BSBODP), which enables every pair of parent-child computing nodes to mutually transfer and distill knowledge extracted from generated bridge samples. This design enhances performance by exploiting the potential of larger models while satisfying both the privacy constraints of FL and the flexibility requirements of EECC. Experiments under various settings demonstrate that FedAgg outperforms state-of-the-art methods by an average of 4.53% in accuracy, with remarkable improvements in convergence rate. Our code is available at https://github.com/wuzhiyuan2000/FedAgg.
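The abstract describes BSBODP as mutual knowledge transfer between each parent-child node pair via distillation on generated bridge samples. The paper's exact protocol is not reproduced here; the following is only a minimal sketch of the general bidirectional-distillation idea it alludes to, in which each node treats the other's temperature-softened predictions on shared bridge samples as soft targets. All function names and the toy logits are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z, t=1.0):
    # Temperature-softened softmax over the last axis.
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch of bridge samples.
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def mutual_distillation_losses(child_logits, parent_logits, t=2.0):
    # Illustrative two-way distillation: each node mimics the other's
    # softened output on the same bridge samples, so knowledge flows
    # both child -> parent and parent -> child (models may differ in size).
    p_child = softmax(child_logits, t)
    p_parent = softmax(parent_logits, t)
    loss_child = kl(p_parent, p_child)    # child matches parent's soft targets
    loss_parent = kl(p_child, p_parent)   # parent matches child's soft targets
    return loss_child, loss_parent

# Toy logits standing in for a small child model and a larger parent model
# evaluated on the same batch of generated bridge samples.
rng = np.random.default_rng(0)
child_logits = rng.normal(size=(4, 10))
parent_logits = rng.normal(size=(4, 10))
lc, lp = mutual_distillation_losses(child_logits, parent_logits)
```

Because only predictions on synthetic bridge samples are exchanged, no raw device data leaves a node, which is consistent with the privacy constraint the abstract highlights.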

Funding

National Natural Science Foundation of China (No. 62072436)

National Key Research and Development Program of China (No. 2021YFB2900102)

Innovation Capability Support Program of Shaanxi (No. 2023-CX-TD-08)

Shaanxi Qinchuangyuan "scientists+engineers" team (No. 2023KXJ-040)

Innovation Funding of ICT, CAS (No. E261080)

History

Email Address of Submitting Author

wuzhiyuan22s@ict.ac.cn

ORCID of Submitting Author

0000-0002-8925-4896

Submitting Author's Institution

Institute of Computing Technology, Chinese Academy of Sciences

Submitting Author's Country

China
