ActivationFunctions

Implementing activation functions from scratch in Tensorflow.

ActivationFunctions using Custom Layers in Keras

Activation functions are an important area of deep learning research. Many new activation functions are being developed, including bio-inspired activations, purely mathematical activation functions, and others. Despite such advancements, we usually find ourselves using ReLU and LeakyReLU without considering the alternatives. In the following notebooks I showcase how easy/difficult it is to port an activation function using Custom Layers in Keras and TensorFlow!
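As a minimal sketch of the idea (not the repository's actual code), porting an activation into a custom Keras layer typically means subclassing `tf.keras.layers.Layer`; here Swish is shown with a trainable `beta` parameter, analogous to how ParametricReLU learns its slope:

```python
import tensorflow as tf

class Swish(tf.keras.layers.Layer):
    """Swish activation, swish(x) = x * sigmoid(beta * x), as a custom layer.

    A hypothetical illustration; the notebook's implementation may differ.
    """

    def build(self, input_shape):
        # One trainable scalar beta, initialised to 1 (beta=1 recovers SiLU).
        self.beta = self.add_weight(
            name="beta", shape=(), initializer="ones", trainable=True
        )
        super().build(input_shape)

    def call(self, inputs):
        return inputs * tf.sigmoid(self.beta * inputs)
```

The layer can then be dropped into any `tf.keras.Sequential` model like a built-in activation.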

Link to main notebook --> Activations.ipynb

Implemented activations:

  • LeakyReLU
  • ParametricReLU
  • ELU
  • SELU
  • Swish
  • GELU
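For reference, the listed activations can be written down in plain NumPy (a sketch of the standard formulas, not the notebook's code; the GELU uses the common tanh approximation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def leaky_relu(x, alpha=0.01):
    # Slope alpha for negative inputs instead of hard zero.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Smooth exponential saturation for negative inputs.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the self-normalising constants from the SELU paper.
    return scale * elu(x, alpha)

def swish(x, beta=1.0):
    return x * sigmoid(beta * x)

def gelu(x):
    # tanh approximation of the Gaussian Error Linear Unit.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))
```

ParametricReLU has the same form as `leaky_relu` but treats `alpha` as a learned parameter rather than a fixed constant.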

Structure

src
|
|-- Activations.ipynb
|-- utils
     |-- Utils.ipynb
     |-- utils.py
     
references
|
|--Ref1
|--Refn

Usage

 git clone  https://github.com/Agrover112/ActivationFunctions.git
