Optimizer Visualization

Visualize TensorFlow's optimizers.

Project README

optimizer-visualization

Visualize gradient descent optimization algorithms in TensorFlow.

All methods start at the same location, specified by two variables. Both the x and y variables are updated by each of the following optimizers (a minimal setup sketch follows the list):

Adadelta documentation

Adagrad documentation

Adam documentation

Ftrl documentation

GD documentation

Momentum documentation

RMSProp documentation
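
The repository's own training loop is not reproduced in this README, so the sketch below shows one way to run every optimizer listed above from the same starting point on a 2-D surface and record its (x, y) path, using TensorFlow 2's tf.keras.optimizers API. The loss surface, starting point, step count, and learning rates are illustrative assumptions, not the project's actual values.

```python
# A minimal sketch (not the project's actual code): run several optimizers
# from one shared starting point on an illustrative quadratic surface and
# record each (x, y) trajectory. The surface, start, step count, and
# learning rates below are assumptions chosen only for demonstration.
import tensorflow as tf

def loss_fn(x, y):
    # Illustrative elongated bowl; the repository's surfaces may differ.
    return tf.square(x) + 10.0 * tf.square(y)

def make_optimizers():
    # One learning rate per optimizer, mirroring the per-optimizer values
    # shown in the figure legends (the numbers here are placeholders).
    return {
        "Adadelta": tf.keras.optimizers.Adadelta(learning_rate=1.0),
        "Adagrad":  tf.keras.optimizers.Adagrad(learning_rate=0.1),
        "Adam":     tf.keras.optimizers.Adam(learning_rate=0.05),
        "Ftrl":     tf.keras.optimizers.Ftrl(learning_rate=0.1),
        "GD":       tf.keras.optimizers.SGD(learning_rate=0.05),
        "Momentum": tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
        "RMSProp":  tf.keras.optimizers.RMSprop(learning_rate=0.02),
    }

paths = {}
for name, opt in make_optimizers().items():
    # Every optimizer starts at the same (x, y) location.
    x = tf.Variable(2.0)
    y = tf.Variable(1.5)
    path = [(x.numpy(), y.numpy())]
    for _ in range(200):
        with tf.GradientTape() as tape:
            loss = loss_fn(x, y)
        grads = tape.gradient(loss, [x, y])
        opt.apply_gradients(zip(grads, [x, y]))
        path.append((x.numpy(), y.numpy()))
    paths[name] = path  # plot over the loss contours to reproduce the figures
```

Plotting each recorded path over the contour lines of the loss surface gives figures comparable to the GIFs in this project.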

For an overview of each gradient descent optimization algorithm, visit this helpful resource.

Numbers in the figure legends indicate the learning rate used for each optimizer.

Note the optimizers' behavior when the gradient is steep.

Note the optimizers' behavior when the initial gradient is minuscule.
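
The two regimes called out above can be reproduced with simple test surfaces. The functions below are assumptions chosen only for illustration; the project's actual surfaces are not specified in this README.

```python
import tensorflow as tf

def steep_valley(x, y):
    # The gradient in y is roughly 200x larger than in x for comparable
    # coordinates, so a step size that is stable along x can overshoot along y.
    return 0.1 * tf.square(x) + 20.0 * tf.square(y)

def flat_start(x, y):
    # Minimum near (3, 3); far from it the surface is an almost-flat plateau,
    # so the initial gradient is tiny: fixed-step gradient descent barely moves
    # at first, while adaptive methods (e.g. Adam) rescale the small gradient.
    return -tf.exp(-(tf.square(x - 3.0) + tf.square(y - 3.0)) / 2.0)
```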

Inspired by the following GIFs:

From here

README Source: Jaewan-Yun/optimizer-visualization
