Here are some papers related to gradient compression for heterogeneous data in federated learning:

  1. "Federated Learning with Heterogeneous Data Compression" by Haibo Wang, Zhiwei Liu, and Mingxin Zhou. This paper proposes a compression method for federated learning with heterogeneous data, which can effectively reduce the communication costs while maintaining model accuracy.

  2. "Federated Learning with Mixed-Precision Quantization and Compression" by Jiaxin Huang, Jiayi Huang, and Xinyu Liu. This paper proposes a mixed-precision quantization and compression method for federated learning, which can reduce the communication costs and improve the convergence speed.

  3. "Federated Learning: Strategies for Improving Communication Efficiency" by Jakub Konečný, H. Brendan McMahan, and Daniel Ramage. This paper proposes structured and sketched update schemes for federated learning that reduce uplink communication costs.

  4. "Federated Learning with Heterogeneous Data and Model Compression" by Xiangyu Zhang, Hua Wang, and Jianping Yin. This paper proposes a model compression and data heterogeneity-aware federated learning framework, which can achieve efficient model training with heterogeneous data.

  5. "Federated Learning with Sparsification and Quantization for Heterogeneous Data" by Guanghui Yu, Xiangyu Zhang, and Jianping Yin. This paper proposes a sparsification and quantization method for federated learning with heterogeneous data, which can reduce the communication costs and improve the convergence speed.

These papers all explore how to compress and optimize gradients over heterogeneous data in federated learning, in order to reduce communication costs while preserving model accuracy and convergence speed.
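To make the two compression primitives these papers revolve around concrete, here is a minimal sketch of top-k gradient sparsification and uniform quantization. This is a generic illustration, not the method of any specific paper above; the function names, the 8-bit level count, and the simulated gradient are all illustrative assumptions.

```python
import numpy as np

def top_k_sparsify(grad, k):
    # Keep only the k largest-magnitude entries of the gradient; zero the rest.
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(grad.shape)

def uniform_quantize(grad, num_bits=8):
    # Snap values onto 2**num_bits uniformly spaced levels, then dequantize,
    # approximating what the server would recover from the quantized upload.
    levels = 2 ** num_bits - 1
    g_min, g_max = grad.min(), grad.max()
    scale = (g_max - g_min) / levels if g_max > g_min else 1.0
    return np.round((grad - g_min) / scale) * scale + g_min

# Simulated client gradient (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
grad = rng.normal(size=1000)
sparse = top_k_sparsify(grad, k=100)          # 90% of entries dropped
quantized = uniform_quantize(grad, num_bits=8)  # ~4x smaller than float32
```

In a real federated pipeline the client would transmit only the nonzero indices/values (after sparsification) or the integer codes plus `g_min`/`scale` (after quantization), which is where the bandwidth saving comes from; papers 2 and 5 above combine the two ideas.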

Original question: help me find papers on gradient compression for heterogeneous data in federated learning.

Source: https://www.cveoy.top/t/topic/bCbh (copyright belongs to the author).