noisy K-FAC

The major contributors to this repository are Guodong Zhang and Shengyang Sun. Note that this repo uses a modified version of the TensorFlow K-FAC library.

Update

A new repo was released with implementations of noisy K-FAC and noisy EK-FAC.

Introduction

This repository contains the code to reproduce the classification results from the paper Noisy Natural Gradient as Variational Inference (paper, video). For the RL code, see VIME-NNG.

Noisy Natural Gradient: variational inference can be instantiated as natural gradient with adaptive weight noise. By further approximating the full Fisher with K-FAC, we obtain noisy K-FAC, a surprisingly simple variational training algorithm for Bayesian neural nets. Noisy K-FAC not only improves classification accuracy but also gives well-calibrated predictions.
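Schematically, the idea can be written as follows (a sketch only: symbols follow the paper's conventions, and damping and prior terms are omitted here). The variational posterior over a layer's weights is a Gaussian whose covariance is tied to the K-FAC approximation of the Fisher:

```latex
% Gaussian variational posterior over the weights W of one layer
q(W) = \mathcal{N}\!\left(W;\, \mu,\, \Sigma\right), \qquad
\Sigma = \frac{\lambda}{N}\, \tilde{F}^{-1}, \qquad
\tilde{F} \approx A \otimes S
% A = E[a a^T]   : covariance of the layer's input activations
% S = E[s s^T]   : covariance of the pre-activation gradients
% (A and S are the standard K-FAC Kronecker factors)

% Natural-gradient step on the mean, with weights sampled as W ~ q(W):
\mu \leftarrow \mu + \alpha\, \tilde{F}^{-1}\, \nabla_{\mu} \mathcal{L}
```

Because the same Kronecker-factored matrix serves both as the preconditioner for the mean update and (inverted and scaled) as the posterior covariance, variational training reduces to natural-gradient training with adaptive weight noise.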

Currently, the implementation of convolution with multiple weight samples (which is very useful for Bayesian neural nets) is messy and slow; we plan to implement a new TensorFlow op after NIPS.
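To see why per-sample weights are awkward, note that standard layer ops assume one weight tensor shared across the whole batch, so drawing an independent weight sample per input forces a loop (or an inefficient reshaping trick). A minimal NumPy sketch of the per-sample pattern for a linear layer (a fully factorized Gaussian is used here purely for illustration; noisy K-FAC uses a matrix-variate Gaussian):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(mu, sigma, rng):
    """Draw one weight sample W = mu + sigma * eps."""
    return mu + sigma * rng.standard_normal(mu.shape)

def multi_sample_forward(x, mu, sigma, rng):
    """Forward each example with its own independently sampled weights.

    This per-example loop is cheap for a dense layer, but for convolutions
    the analogous pattern is slow because conv ops expect a single kernel
    shared across the batch.
    """
    outs = []
    for i in range(x.shape[0]):
        W = sample_weights(mu, sigma, rng)  # fresh sample per example
        outs.append(x[i] @ W)
    return np.stack(outs)

batch, d_in, d_out = 4, 3, 2
x = rng.standard_normal((batch, d_in))
mu = rng.standard_normal((d_in, d_out))
y = multi_sample_forward(x, mu, 0.1, rng)
print(y.shape)  # (4, 2)
```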

Citation

To cite this work, please use

@article{zhang2017noisy,
  title={Noisy Natural Gradient as Variational Inference},
  author={Zhang, Guodong and Sun, Shengyang and Duvenaud, David and Grosse, Roger},
  journal={arXiv preprint arXiv:1712.02390},
  year={2017}
}

Dependencies

This project uses Python 3.5.2. Before running the code, you have to install the required dependencies.

Example

python main.py --config configs/kfac_plain.json

Tensorboard Visualization

This implementation supports TensorBoard visualization. All you have to do is launch TensorBoard from your experiment directory, located in experiments/.

tensorboard --logdir=experiments/cifar10/noisy-kfac/summaries
README Source: gd-zhang/noisy-K-FAC