DOI: 10.1145/3576842.3582377

Research Article · Open Access

Async-HFL: Efficient and Robust Asynchronous Federated Learning in Hierarchical IoT Networks

Published: 09 May 2023

ABSTRACT

Federated Learning (FL) has gained increasing interest in recent years as a distributed on-device learning paradigm. However, multiple challenges remain to be addressed for deploying FL in real-world Internet-of-Things (IoT) networks with hierarchies. Although existing works have proposed various approaches to account for data heterogeneity, system heterogeneity, unexpected stragglers, and scalability, none provides a systematic solution that addresses all of these challenges in a hierarchical and unreliable IoT network. In this paper, we propose Async-HFL, an asynchronous and hierarchical framework for performing FL in a common three-tier IoT network architecture. To cope with widely varying networking and system processing delays, Async-HFL employs asynchronous aggregation at both the gateway and cloud levels, thus avoiding long waiting times. To fully unleash the potential of Async-HFL in convergence speed under system heterogeneity and stragglers, we design device selection at the gateway level and device-gateway association at the cloud level. The device selection module chooses diverse and fast edge devices to trigger local training in real time, while the device-gateway association module periodically redetermines an efficient network topology after several cloud epochs; both modules satisfy bandwidth limitations. We evaluate Async-HFL's convergence speedup using large-scale simulations based on ns-3 and a network topology from NYCMesh. Our results show that Async-HFL converges 1.08-1.31x faster in wall-clock time and saves up to 21.6% in total communication cost compared to state-of-the-art asynchronous FL algorithms (with client selection). We further validate Async-HFL on a physical deployment and observe robust convergence under unexpected stragglers.
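The core mechanism the abstract describes, aggregating updates asynchronously as they arrive rather than waiting for all devices, can be sketched in a few lines. The snippet below is a minimal, illustrative model of staleness-weighted asynchronous aggregation (in the spirit of FedAsync-style polynomial discounting); the class and function names, the discount formula, and all hyperparameter values are assumptions for illustration, not the paper's exact design.

```python
# Minimal sketch of staleness-weighted asynchronous aggregation, the kind of
# mechanism an Async-HFL-style gateway or cloud aggregator could use. Stale
# updates are discounted instead of discarded, so slow devices never block
# a round. All names and constants here are illustrative assumptions.

def staleness_weight(current_round, update_round, alpha=0.6, a=0.5):
    """Discount an update by its staleness: alpha * (1 + staleness)^(-a)."""
    staleness = current_round - update_round
    return alpha * (1.0 + staleness) ** (-a)

class AsyncAggregator:
    """Applies each update on arrival instead of waiting for all devices."""

    def __init__(self, model):
        self.model = list(model)  # global parameters, a flat list of floats
        self.round = 0

    def apply_update(self, client_model, update_round):
        # Weight the incoming (possibly stale) model by its staleness, then
        # move the global model toward it with an exponential moving average.
        w = staleness_weight(self.round, update_round)
        self.model = [(1 - w) * g + w * c
                      for g, c in zip(self.model, client_model)]
        self.round += 1
        return self.model
```

A fresh update (staleness 0) moves the global model by the full mixing rate alpha, while an update computed many rounds ago moves it proportionally less, which is why such schemes remain robust when stragglers eventually report back.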



Published in
    IoTDI '23: Proceedings of the 8th ACM/IEEE Conference on Internet of Things Design and Implementation
    May 2023
    514 pages
    ISBN:9798400700378
    DOI:10.1145/3576842

    Copyright © 2023 Owner/Author

    This work is licensed under a Creative Commons Attribution International 4.0 License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Qualifiers

    • research-article
    • Research
    • Refereed limited
