Federated Learning

A PyTorch Implementation of Federated Learning http://doi.org/10.5281/zenodo.4321561



This is a partial reproduction of the paper Communication-Efficient Learning of Deep Networks from Decentralized Data.
Only experiments on MNIST and CIFAR-10 (both IID and non-IID) have been produced so far.

Note: the scripts will be slow because parallel computing is not implemented.
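For reference, the heart of FedAvg is averaging the client models' weights after each round of local training. Below is a minimal sketch of that aggregation step; the function name fed_avg, the equal client weighting, and the usage line are illustrative assumptions, not the repo's exact code:

import copy
import torch

def fed_avg(w_locals):
    # Average a list of client state_dicts, weighting every client equally.
    w_avg = copy.deepcopy(w_locals[0])
    for key in w_avg.keys():
        for w in w_locals[1:]:
            w_avg[key] += w[key]
        w_avg[key] = torch.div(w_avg[key], len(w_locals))
    return w_avg

# e.g. global_model.load_state_dict(fed_avg([m.state_dict() for m in client_models]))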

Requirements

python>=3.6
pytorch>=0.4
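A typical environment setup would be as follows (torchvision for the MNIST/CIFAR-10 loaders is an assumption, and the versions are minimums rather than pins):

pip install "torch>=0.4" torchvision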

Run

The MLP and CNN models (centralized baseline) are trained with:

python main_nn.py

Federated learning with the MLP and CNN models is run with:

python main_fed.py

See the arguments in options.py.

For example:

python main_fed.py --dataset mnist --iid --num_channels 1 --model cnn --epochs 50 --gpu 0

Add --all_clients to average over all client models.

NB: for CIFAR-10, num_channels must be 3.
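When --iid is omitted, the non-IID split described in the paper is the pathological sort-and-shard partition: sort the training data by label, cut it into shards, and hand each client a small number of shards, so most clients see only a couple of digit classes. A rough sketch is below; the helper name, shard counts, and two-shards-per-client choice are illustrative and the repo's own sampling code may differ in detail:

import numpy as np

def mnist_noniid_sketch(labels, num_users=100, num_shards=200, shard_size=300):
    # Sort sample indices by digit label, split into shards, give each user 2 shards.
    idxs = np.argsort(labels)
    shard_ids = np.random.permutation(num_shards)
    dict_users = {}
    for i in range(num_users):
        own = shard_ids[2 * i: 2 * i + 2]
        dict_users[i] = np.concatenate(
            [idxs[s * shard_size:(s + 1) * shard_size] for s in own])
    return dict_users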

Results

MNIST

Results are shown in Table 1 and Table 2, with the parameters C=0.1 (fraction of clients sampled per round), B=10 (local batch size), and E=5 (local epochs); a sketch of the local update under these parameters follows Table 2.

Table 1. Results of 10 epochs of training with a learning rate of 0.01

Model         IID accuracy    Non-IID accuracy
FedAVG-MLP    94.57%          70.44%
FedAVG-CNN    96.59%          77.72%

Table 2. Results of 50 epochs of training with a learning rate of 0.01

Model         IID accuracy    Non-IID accuracy
FedAVG-MLP    97.21%          93.03%
FedAVG-CNN    98.60%          93.81%
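For completeness, a rough sketch of one client's local update under these parameters, i.e. E=5 local epochs of SGD with batch size B=10 at the stated learning rate (the function and its defaults are illustrative, not the repo's actual client-update code):

import copy
import torch
from torch import nn
from torch.utils.data import DataLoader

def local_update_sketch(global_model, dataset, lr=0.01, local_ep=5, local_bs=10):
    # One client: copy the global model, run E local epochs of SGD with batch size B.
    model = copy.deepcopy(global_model)
    model.train()
    loader = DataLoader(dataset, batch_size=local_bs, shuffle=True)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(local_ep):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
    return model.state_dict()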

Acknowledgements

Thanks to youkaichao.

References

McMahan, Brendan, Eider Moore, Daniel Ramage, Seth Hampson, and Blaise Aguera y Arcas. Communication-Efficient Learning of Deep Networks from Decentralized Data. In Artificial Intelligence and Statistics (AISTATS), 2017.

Cite As

Shaoxiong Ji. (2018, March 30). A PyTorch Implementation of Federated Learning. Zenodo. http://doi.org/10.5281/zenodo.4321561
