FedCGS: Leveraging Feature Compression and Conditional GANs for Privacy Protection and Efficient Communication in Federated Learning


Haoxuan Sun

Abstract

Federated learning is a distributed machine learning technique that collaboratively trains a central model from decentralized data, protecting data privacy by exchanging partial model updates rather than raw data in each iteration of model learning. However, recent work has shown that the information exchanged between the server and the clients remains vulnerable to gradient-based privacy attacks, and traditional privacy protection methods can incur significant costs, such as communication overhead and performance degradation. An algorithm is therefore needed that protects data privacy while reducing communication costs and performance losses. Here, we propose FedCGS, a federated learning framework that leverages conditional generative adversarial networks and feature compression to achieve privacy protection while maintaining competitive model performance and markedly improving communication efficiency. We split each client's local network into a feature extractor and a public classifier: the extractor is kept local to protect privacy, while the classifier and the client generator are compressed through singular value decomposition and sent to the server for aggregation and update, improving both local model performance and communication efficiency. Extensive experiments show that FedCGS improves local model performance over traditional federated learning and achieves better communication efficiency than common conditional GAN approaches. We therefore believe FedCGS has strong potential for federated learning.
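The two mechanisms the abstract describes, splitting the local network into a private extractor and a shared classifier, and shrinking the uploaded components with singular value decomposition, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes PyTorch, and the module names (FeatureExtractor, Classifier), helper names (compress_svd, decompress_svd), layer sizes, and rank are all hypothetical choices for illustration.

```python
# Minimal sketch of the client-side split and SVD-based upload compression.
# Assumptions: PyTorch; all names, layer sizes, and the rank are hypothetical.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Kept on the client; its parameters are never uploaded."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(784, 256), nn.ReLU())

    def forward(self, x):
        return self.body(x)

class Classifier(nn.Module):
    """Shared with the server for aggregation and update."""
    def __init__(self):
        super().__init__()
        self.head = nn.Linear(256, 10)

    def forward(self, z):
        return self.head(z)

def compress_svd(weight: torch.Tensor, rank: int):
    """Keep only the top-`rank` singular triplets of a 2-D weight matrix,
    cutting the upload from m*n values to rank*(m + n + 1)."""
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    return U[:, :rank], S[:rank], Vh[:rank, :]

def decompress_svd(U, S, Vh):
    """Server-side reconstruction of the low-rank weight approximation."""
    return U @ torch.diag(S) @ Vh

# Client: compress the classifier weight before upload.
clf = Classifier()
U, S, Vh = compress_svd(clf.head.weight.data, rank=8)

# Server: reconstruct an approximation for aggregation.
approx = decompress_svd(U, S, Vh)
```

Under these assumptions the communication saving is the usual low-rank trade-off: for an m-by-n weight matrix, uploading the truncated factors costs rank*(m + n + 1) values instead of m*n, at the price of an approximation error bounded by the discarded singular values.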
