Codebase for "A deep cut into Split Federated Self-supervised Learning"

This repository implements Momentum-Aligned Contrastive Split Federated Learning (MonAcoSFL).
It extends MocoSFL by keeping the online and momentum client models aligned during parameter synchronization. This stabilizes training and yields substantially better results at deeper cut layers, which are more communication-efficient.
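To illustrate the alignment idea, here is a minimal sketch (not the repository's actual code) of one plausible synchronization rule: client online parameters are averaged, and the momentum copies are re-aligned to the synchronized online weights rather than drifting apart. Tiny tensors stand in for client-side model parameters; all names are hypothetical.

```python
import torch

def synchronize(online_clients, momentum_clients):
    """Hypothetical momentum-aligned sync step.

    Averages the online client parameters (standard SFL synchronization),
    then re-aligns every momentum copy to the freshly averaged online
    weights so the online/momentum pairs stay consistent afterwards.
    """
    # Standard federated averaging over the online client models.
    avg = torch.stack(online_clients).mean(dim=0)
    online_synced = [avg.clone() for _ in online_clients]
    # Alignment step: momentum models are reset to the synchronized
    # online weights instead of being averaged independently.
    momentum_synced = [avg.clone() for _ in momentum_clients]
    return online_synced, momentum_synced
```

After this step every client's online and momentum branches start from identical weights, which is the kind of consistency the repository's method maintains across synchronizations.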
Requirements: Python > 3.7 with PyTorch and torchvision
/run_monacosfl.py
-- Main source file for MonAcoSFL
/scripts
-- Evaluation scripts used on a single-GPU machine
This repository is based on the original MocoSFL codebase. We thank the authors for open-sourcing it.