Federated learning and contrastive learning are two important machine learning paradigms, and federated contrastive learning is their combination, enabling models to be pretrained on unlabeled data across data silos while protecting data privacy and security. However, existing federated contrastive learning algorithms require the transmission of model weights and data representations between the server and clients, which is costly and greatly increases data security risks. Therefore, how to reduce the amount of data communicated, especially the representations, while maintaining model performance is an important problem that has not yet received attention. In this study, we propose a Federated Contrastive Learning Algorithm using Particle Swarm Optimization, which reduces data communication by transmitting client model scores instead. In our experiments, the proposed method reduces weight uploads by 40% and representation uploads by 78.97% compared with the baseline algorithm, with only a 1.93% decrease in accuracy.
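The abstract does not specify the exact particle swarm optimization (PSO) variant used, but the core idea of PSO, which underlies having clients report scalar scores rather than full weights or representations, can be illustrated with a generic sketch. Everything below (function names, coefficient values, the toy objective) is a hypothetical illustration, not the paper's actual algorithm:

```python
import random

def pso_minimize(objective, dim, n_particles=10, iters=50, seed=0):
    """Generic PSO sketch (hypothetical, not the paper's algorithm).

    In a federated setting, `objective` would stand in for a client's
    scalar model score, so only scores travel between client and server.
    """
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed values)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity blends inertia, pull toward personal best, and pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])                  # only this scalar is "communicated"
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: minimize the sphere function in 2 dimensions
best, best_val = pso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

The communication advantage is visible in the structure: each evaluation exchanges a single float per particle per round, whereas federated averaging would exchange full weight vectors.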
|Title of host publication||2021 3rd International Academic Exchange Conference on Science and Technology Innovation (IAECST)|
|ISBN (Electronic)||978-1-6654-0267-5, 978-1-6654-0266-8|
|Publication status||Published - 7 Dec 2022|
|Event||2021 3rd International Academic Exchange Conference on Science and Technology Innovation (IAECST) - Guangzhou, China|
Duration: 10 Dec 2021 → 12 Dec 2021
|Conference||2021 3rd International Academic Exchange Conference on Science and Technology Innovation (IAECST)|
|Period||10/12/21 → 12/12/21|