A Collection of Federated Learning Papers

I. Surveys

  1. Lim W Y B, Luong N C, Hoang D T, et al. Federated learning in mobile edge networks: A comprehensive survey[J]. IEEE Communications Surveys & Tutorials, 2020, 22(3).
  2. Yang Q, Liu Y, Cheng Y, et al. Federated learning[J]. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2019, 13(3): 1-207.
  3. Yang Q. AI and data privacy protection: The solution of federated learning[J]. Journal of Information Security Research, 2019, 5(11): 961-965. (in Chinese)
  4. Tan A Z, Yu H, Cui L, et al. Towards personalized federated learning[J]. IEEE Transactions on Neural Networks and Learning Systems, 2022.
  5. Li L, Fan Y, Tse M, et al. A review of applications in federated learning[J]. Computers & Industrial Engineering, 2020, 149: 106854.
  6. Wang J, Kong L, Huang Z, et al. A survey of federated learning algorithms[J]. Big Data Research, 2020, 6(6): 64-82. (in Chinese)
  7. Yang G, Wang Z. Research progress on privacy protection in federated learning[J]. Journal of Nanjing University of Posts and Telecommunications (Natural Science Edition), 2020, 40(5): 204-214. DOI:10.14132/j.cnki.1673-5439.2020.05.022. (in Chinese)
  8. Zhou C, Sun Y, Wang D, et al. A survey of federated learning research[J]. Chinese Journal of Network and Information Security, 2021, 7(5): 77-92. (in Chinese)
  9. Wang J, Kong L, Huang Z, et al. Advances in privacy protection for federated learning[J]. Big Data Research, 2021, 7(3): 130-149. (in Chinese)
  10. Liu Y, Chen H, Liu Y, et al. Privacy protection techniques in federated learning[J]. Journal of Software, 2022, 33(3): 1057-1092. DOI:10.13328/j.cnki.jos.006446. (in Chinese)

Healthcare

  1. Nguyen D C, Pham Q V, Pathirana P N, et al. Federated learning for smart healthcare: A survey[J]. ACM Computing Surveys (CSUR), 2022, 55(3): 1-37.
  2. Shyu C R, Putra K T, Chen H C, et al. A systematic review of federated learning in the healthcare area: From the perspective of data properties and applications[J]. Applied Sciences, 2021, 11(23): 11191.

Natural language processing

  1. Liu M, Ho S, Wang M, et al. Federated learning meets natural language processing: A survey[J]. arXiv preprint arXiv:2107.12603, 2021.
  2. Ji S, Pan S, Long G, et al. Learning private neural language modeling with attentive aggregation[C]//2019 International joint conference on neural networks (IJCNN). IEEE, 2019: 1-8.(FedAtt)

Platform development

  1. Li Q, Wen Z, Wu Z, et al. A survey on federated learning systems: vision, hype and reality for data privacy and protection[J]. IEEE Transactions on Knowledge and Data Engineering, 2021.

II. Convergence on non-IID data

  1. Li, T., Sahu, A. K., Zaheer, M., Sanjabi, M., Talwalkar, A., and Smith, V. Federated optimization in heterogeneous networks. In MLSys, 2020.
  2. Wang, S., Tuor, T., Salonidis, T., Leung, K. K., Makaya, C., He, T., and Chan, K. Adaptive federated learning in resource constrained edge computing systems. IEEE Journal on Selected Areas in Communications, 37(6): 1205–1221, 2019.
  3. Khaled, A., Mishchenko, K., and Richtárik, P. First analysis of local GD on heterogeneous data. In NeurIPSW, 2019.
  4. Li, X., Huang, K., Yang, W., Wang, S., and Zhang, Z. On the convergence of FedAvg on non-IID data. In ICLR, 2020.
  5. Hsieh, K., Phanishayee, A., Mutlu, O., and Gibbons, P. The non-IID data quagmire of decentralized machine learning. In ICML, 2020.
  6. Wang, J., Liu, Q., Liang, H., Joshi, G., and Poor, H. V. Tackling the objective inconsistency problem in heterogeneous federated optimization. In NeurIPS, 2020.

III. Optimization methods

  1. FedAvg. McMahan B, Moore E, Ramage D, et al. Communication-efficient learning of deep networks from decentralized data[C]//Artificial intelligence and statistics. PMLR, 2017: 1273-1282.
  2. FedProx. Li T, Sahu A K, Zaheer M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450. (adds a quadratic proximal penalty term)
  3. SCAFFOLD. Karimireddy S P, Kale S, Mohri M, et al. Scaffold: Stochastic controlled averaging for federated learning[C]//International Conference on Machine Learning. PMLR, 2020: 5132-5143. (adds control variates)
  4. Li T, Hu S, Beirami A, et al. Ditto: Fair and robust federated learning through personalization[C]//International Conference on Machine Learning. PMLR, 2021: 6357-6368.
  5. T Dinh C, Tran N, Nguyen J. Personalized federated learning with Moreau envelopes[J]. Advances in Neural Information Processing Systems, 2020, 33: 21394-21405. (pFedMe)
  6. Li, T., Sahu, A. K., Zaheer, M., Sanjabi, M., Talwalkar, A., and Smith, V. FedDANE: A federated Newton-type method. In ACSCC, 2019. (adds control variates)
  7. Zhang, X., Hong, M., Dhople, S., Yin, W., and Liu, Y. FedPD: A federated learning framework with optimal rates and adaptivity to non-IID data. arXiv preprint arXiv:2005.11418, 2020.
  8. FedDyn. Acar, D. A. E., Zhao, Y., Matas, R., Mattina, M., Whatmough, P., and Saligrama, V. Federated learning based on dynamic regularization. In ICLR, 2021.
  9. Yoon, T., Shin, S., Hwang, S. J., and Yang, E. FedMix: Approximation of mixup under mean augmented federated learning. In ICLR, 2021. (data augmentation)
  10. Li, Q., He, B., and Song, D. Model-contrastive federated learning. In CVPR, 2021. (contrastive learning)
  11. Deng Y, Kamani M M, Mahdavi M. Adaptive personalized federated learning[J]. arXiv preprint arXiv:2003.13461, 2020.
    The methods above typically require high client participation rates, extra communication cost, or additional client-side memory.
  12. FedAvgM. Hsu, T.-M. H., Qi, H., and Brown, M. Measuring the effects of non-identical data distribution for federated visual classification. arXiv preprint arXiv:1909.06335, 2019. (adds server-side momentum to accelerate convergence)
  13. FedADAM. Reddi, S. J., Charles, Z., Zaheer, M., Garrett, Z., Rush, K., Konečný, J., Kumar, S., and McMahan, H. B. Adaptive federated optimization. In ICLR, 2021. (adaptive gradient methods)
  14. Seo, H., Park, J., Oh, S., Bennis, M., and Kim, S.-L. Federated knowledge distillation. arXiv preprint arXiv:2011.02367, 2020. (federated knowledge distillation; requires auxiliary data)
  15. Lee, G., Shin, Y., Jeong, M., and Yun, S.-Y. Preservation of the global knowledge by not-true self knowledge distillation in federated learning. arXiv preprint arXiv:2106.03097, 2021. (transfers the global model's knowledge to the local network)
  16. Yao, D., Pan, W., Dai, Y., Wan, Y., Ding, X., Jin, H., Xu, Z., and Sun, L. Local-global knowledge distillation in heterogeneous federated learning with non-IID data. arXiv preprint arXiv:2107.00051, 2021. (refines local models with representations from an ensemble of historical global models; incurs extra communication overhead)
  17. Zhu, Z., Hong, J., and Zhou, J. Data-free knowledge distillation for heterogeneous federated learning. In ICML, 2021. (learns a global generator that aggregates local information and distills global knowledge to the clients; incurs extra communication overhead)
  18. Lin, T., Kong, L., Stich, S. U., and Jaggi, M. Ensemble distillation for robust model fusion in federated learning. In NeurIPS, 2020. (aggregates via the averaged representations of local models on proxy data; server-side knowledge distillation; requires auxiliary data)
  19. FedMLB. Kim J, Kim G, Han B. Multi-level branched regularization for federated learning[C]//International Conference on Machine Learning. PMLR, 2022: 11058-11073. (a knowledge-distillation-based local optimization method that needs neither extra communication cost nor auxiliary data; most of the entries above are drawn from this paper)
    Note: federated learning algorithms are sensitive to communication cost, so globally shared auxiliary data should be used with caution.
  20. Fallah A, Mokhtari A, Ozdaglar A. Personalized federated learning: A meta-learning approach[J]. arXiv preprint arXiv:2002.07948, 2020.
  21. Keçeci C, Shaqfeh M, Mbayed H, et al. Multi-Task and Transfer Learning for Federated Learning Applications[J]. arXiv preprint arXiv:2207.08147, 2022.
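Many of the regularized local-objective methods above (FedProx, FedDyn, and related entries) share one skeleton: each client minimizes its local loss plus a correction term, and the server averages the resulting models. A minimal NumPy sketch of one such round on a toy least-squares problem, with a FedProx-style proximal term (all function names and hyperparameters here are illustrative choices, not taken from any of the cited implementations):

```python
import numpy as np

def local_update(w_global, X, y, mu=0.1, lr=0.1, epochs=5):
    """Client step: least-squares loss plus the FedProx proximal term
    (mu/2)*||w - w_global||^2; mu=0 recovers plain FedAvg."""
    w = w_global.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of the local loss
        grad += mu * (w - w_global)         # proximal pull toward the server model
        w -= lr * grad
    return w

def fed_round(w_global, clients, mu=0.1):
    """Server step: average client models, weighted by local sample counts."""
    sizes = [len(y) for _, y in clients]
    updates = [local_update(w_global, X, y, mu=mu) for X, y in clients]
    return np.average(updates, axis=0, weights=sizes)

# toy setup: four clients sampled around the same linear model
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(30):
    w = fed_round(w, clients, mu=0.1)
```

Setting mu=0 gives plain FedAvg; SCAFFOLD would instead add a control-variate correction to the local gradient rather than a proximal pull.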

Clustered federated learning

  1. Ghosh A, Chung J, Yin D, et al. An efficient framework for clustered federated learning[J]. Advances in Neural Information Processing Systems, 2020, 33: 19586-19597. (framework; 312 Google Scholar citations)
  2. Sattler F, Müller K R, Samek W. Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 32(8): 3710-3722. (clusters clients by cosine similarity; 420 Google Scholar citations)
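The cosine-similarity criterion used by Sattler et al. can be illustrated with a toy sketch: when client update directions have low or negative cosine similarity, the clients likely pursue incompatible objectives and can be separated. Below is a greedy one-level bipartition in NumPy; the actual method uses a recursive splitting rule with norm-based stopping criteria, and all names here are illustrative:

```python
import numpy as np

def cosine_sim(u, v):
    """Cosine similarity between two update vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def bipartition(updates, threshold=0.0):
    """Greedy split: clients whose update direction is positively aligned
    with client 0 form one cluster; the rest form the other."""
    c1, c2 = [], []
    for i, u in enumerate(updates):
        (c1 if cosine_sim(updates[0], u) > threshold else c2).append(i)
    return c1, c2

# two synthetic client populations with roughly opposite update directions
rng = np.random.default_rng(1)
updates = [np.array([1.0, 0.0]) + 0.1 * rng.normal(size=2) for _ in range(3)]
updates += [np.array([-1.0, 0.0]) + 0.1 * rng.normal(size=2) for _ in range(3)]
c1, c2 = bipartition(updates)
```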

Personalized federated learning

  1. Fallah A, Mokhtari A, Ozdaglar A. Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach[J]. Advances in Neural Information Processing Systems, 2020, 33: 3557-3568. (meta-learning approach; 329 Google Scholar citations)
  2. Zhang M, Sapra K, Fidler S, et al. Personalized federated learning with first order model optimization[J]. arXiv preprint arXiv:2012.08565, 2020. (first-order model optimization; 121 Google Scholar citations)
  3. Huang Y, Chu L, Zhou Z, et al. Personalized cross-silo federated learning on non-iid data[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2021, 35(9): 7865-7873. (204 Google Scholar citations)

IV. Vertical federated learning

  1. Cheng K, Fan T, Jin Y, et al. Secureboost: A lossless federated learning framework[J]. IEEE Intelligent Systems, 2021, 36(6): 87-98.
  2. Johnson R, Zhang T. Accelerating stochastic gradient descent using predictive variance reduction[J]. Advances in Neural Information Processing Systems, 2013, 26.
  3. Hardy S, Henecka W, Ivey-Law H, et al. Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption[J]. arXiv preprint arXiv:1711.10677, 2017.
  4. Gu B, Xu A, Huo Z, et al. Privacy-preserving asynchronous vertical federated learning algorithms for multiparty collaborative learning[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021.
  5. Zhang Q, Gu B, Deng C, et al. Secure bilevel asynchronous vertical federated learning with backward updating[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2021, 35(12): 10896-10904.
  6. Liu Y, Zhang X, Kang Y, et al. FedBCD: A communication-efficient collaborative learning framework for distributed features[J]. IEEE Transactions on Signal Processing, 2022, 70: 4277-4290.

V. Federated learning platforms

  1. Wang Z, Kuang W, Xie Y, et al. FederatedScope-GNN: Towards a Unified, Comprehensive and Efficient Package for Federated Graph Learning[J]. arXiv preprint arXiv:2204.05562, 2022.
  2. Lai F, Dai Y, Zhu X, et al. FedScale: Benchmarking model and system performance of federated learning[C]//Proceedings of the First Workshop on Systems Challenges in Reliable and Secure Federated Learning. 2021: 1-3.
  3. Liu Y, Fan T, Chen T, et al. FATE: An Industrial Grade Platform for Collaborative Learning With Data Protection[J]. J. Mach. Learn. Res., 2021, 22(226): 1-6.
  4. Ingerman A, Ostrowski K. TensorFlow Federated, 2019. (TFF)
  5. Ryffel T, Trask A, Dahl M, et al. A generic framework for privacy preserving deep learning[J]. arXiv preprint arXiv:1811.04017, 2018.(PySyft)
  6. Caldas S, Duddu S M K, Wu P, et al. Leaf: A benchmark for federated settings[J]. arXiv preprint arXiv:1812.01097, 2018.
  7. Ma Y, Yu D, Wu T, et al. PaddlePaddle: An open-source deep learning platform from industrial practice[J]. Frontiers of Data and Computing, 2019, 1(1): 105-115.
  8. He C, Li S, So J, et al. Fedml: A research library and benchmark for federated machine learning[J]. arXiv preprint arXiv:2007.13518, 2020.
  9. Beutel D J, Topal T, Mathur A, et al. Flower: A friendly federated learning framework[J]. 2022.
  10. Chai D, Wang L, Chen K, et al. Fedeval: A benchmark system with a comprehensive evaluation model for federated learning[J]. arXiv preprint arXiv:2011.09655, 2020.

VI. Privacy protection

  1. Geiping, J., Bauermeister, H., Dröge, H., and Moeller, M. Inverting gradients - how easy is it to break privacy in federated learning? In NeurIPS, 2020.
  2. Wang, H., Sreenivasan, K., Rajput, S., Vishwakarma, H., Agarwal, S., yong Sohn, J., Lee, K., and Papailiopoulos, D. Attack of the tails: Yes, you really can backdoor federated learning. In NeurIPS, 2020.
  3. Geyer, R. C., Klein, T., and Nabi, M. Differentially private federated learning: A client level perspective. In NeurIPS, 2017. (differential privacy)
  4. Kairouz P, McMahan B, Song S, et al. Practical and private (deep) learning without sampling or shuffling[C]//International Conference on Machine Learning. PMLR, 2021: 5213-5225. (noise addition)
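The client-level differential-privacy recipe described by Geyer et al. reduces to two server-side steps: clip each client update to a fixed L2 bound, then add Gaussian noise calibrated to that bound. A hedged NumPy sketch, assuming an illustrative noise_mult parameter and omitting the (epsilon, delta) privacy accounting entirely:

```python
import numpy as np

def dp_aggregate(updates, clip_norm=1.0, noise_mult=0.5, rng=None):
    """Clip each client update to L2 norm <= clip_norm, average,
    then add Gaussian noise whose scale is proportional to the
    clipping bound (privacy accounting omitted in this sketch)."""
    rng = rng if rng is not None else np.random.default_rng()
    clipped = [u * min(1.0, clip_norm / np.linalg.norm(u)) for u in updates]
    mean = np.mean(clipped, axis=0)
    sigma = noise_mult * clip_norm / len(updates)
    return mean + rng.normal(0.0, sigma, size=mean.shape)

# with noise_mult=0 this reduces to plain clipped averaging:
# [3, 4] is scaled to norm 1, [0.3, 0.4] is already within the bound
agg = dp_aggregate([np.array([3.0, 4.0]), np.array([0.3, 0.4])],
                   clip_norm=1.0, noise_mult=0.0)
```

Clipping bounds each client's influence on the average, which is what makes the added noise sufficient for client-level guarantees.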

VII. Supplementary

7.1 Image classification

  1. Li Q, He B, Song D. Model-contrastive federated learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021: 10713-10722. (contrastive learning; 231 Google Scholar citations)
  2. Zhang F, Kuang K, You Z, et al. Federated unsupervised representation learning[J]. arXiv preprint arXiv:2010.08982, 2020. (federated unsupervised contrastive learning; 59 Google Scholar citations)
  3. … (continuously updated)
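The model-contrastive idea in entry 1 (MOON) can be written down compactly: the local model's representation is pulled toward the global model's representation and pushed away from the previous local model's. A sketch of that loss in NumPy, with illustrative variable names and temperature value:

```python
import numpy as np

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """MOON-style loss: negative log-softmax over cosine similarities,
    with the global model's representation as the positive pair and the
    previous local model's representation as the negative pair."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    pos = np.exp(cos(z_local, z_global) / tau)
    neg = np.exp(cos(z_local, z_prev) / tau)
    return float(-np.log(pos / (pos + neg)))

# the loss is lower when the local representation agrees with the global one
aligned = model_contrastive_loss(np.array([1.0, 0.0]),
                                 np.array([1.0, 0.0]),
                                 np.array([0.0, 1.0]))
drifted = model_contrastive_loss(np.array([0.0, 1.0]),
                                 np.array([1.0, 0.0]),
                                 np.array([0.0, 1.0]))
```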

7.2 Federated graph learning


