Cross-Caps/AFLI
AFLI ❄️


Activation Function Design: Linearity and Effective Initialization

Code to reproduce the plots from the paper Activation function design for deep networks: linearity and effective initialization (https://arxiv.org/abs/2105.07741).

Colab Notebooks

  • For an implementation of the variance and covariance maps, see Notebook
  • For analysing an activation function via its correlation map, moment ratio, and dynamical isometry bounds, see Notebook
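For intuition, the variance and correlation maps describe how signal length and similarity evolve layer to layer in a wide random network. The sketch below is a generic Monte Carlo illustration of this machinery under standard mean-field assumptions (Gaussian pre-activations, weight variance σ_w², bias variance σ_b²); it is not the notebooks' code, and the paper's exact definitions may differ.

```python
import numpy as np

def variance_map(phi, q, sigma_w=1.0, sigma_b=0.0, n=200_000, seed=0):
    """Monte Carlo estimate of the variance (length) map
    V(q) = sigma_w^2 * E[phi(sqrt(q) * Z)^2] + sigma_b^2, with Z ~ N(0, 1)."""
    z = np.random.default_rng(seed).standard_normal(n)
    return sigma_w**2 * np.mean(phi(np.sqrt(q) * z) ** 2) + sigma_b**2

def correlation_map(phi, q, c, sigma_w=1.0, sigma_b=0.0, n=200_000, seed=0):
    """Correlation map: next-layer correlation of two inputs that share
    length q and have correlation c, normalized by the variance map."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n)
    # z2 is correlated with z1 at level c.
    z2 = c * z1 + np.sqrt(1.0 - c**2) * rng.standard_normal(n)
    num = sigma_w**2 * np.mean(phi(np.sqrt(q) * z1) * phi(np.sqrt(q) * z2)) + sigma_b**2
    return num / variance_map(phi, q, sigma_w, sigma_b, n, seed)

# Example: iterate the variance map for tanh to approach its fixed point.
q = 1.0
for _ in range(20):
    q = variance_map(np.tanh, q)
```

Iterating `variance_map` to a fixed point q* and then linearizing `correlation_map` around c = 1 is the standard way to read off whether an (activation, initialization) pair sits in the ordered or chaotic regime.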

Training and Testing

  • For training a DNN model on the MNIST, Fashion-MNIST, or CIFAR-10 datasets, see script
    • Requires pre-computed values of for a given ; use the functions from Notebook
  • For training with a fixed value of the parameter 'a' of an activation function, import script
    • The train and trainO functions train with Gaussian and orthogonal weight initialization, respectively
  • For training with a variable value of the parameter 'a' of an activation function, import script
    • Only an implementation with the htanh activation is provided, for demo purposes
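To hint at what such a pre-computed initialization quantity can look like: assuming the parameterization htanh_a(x) = clip(x, -a, a) (the repo's actual parameterization of 'a' may differ), one common mean-field criterion fixes the weight scale so that σ_w² · E[φ'(√q · Z)²] = 1, keeping gradient norms stable at initialization. A hypothetical numpy sketch:

```python
import numpy as np

def htanh(x, a=1.0):
    # Assumed parameterization: saturate at +/- a; the repo's version may differ.
    return np.clip(x, -a, a)

def htanh_deriv(x, a=1.0):
    # Derivative is 1 on the linear region |x| < a, 0 where saturated.
    return (np.abs(x) < a).astype(x.dtype)

def critical_sigma_w(a=1.0, q=1.0, n=200_000, seed=0):
    """Weight std sigma_w satisfying sigma_w^2 * E[phi'(sqrt(q) * Z)^2] = 1,
    estimated by Monte Carlo with Z ~ N(0, 1)."""
    z = np.random.default_rng(seed).standard_normal(n)
    return 1.0 / np.sqrt(np.mean(htanh_deriv(np.sqrt(q) * z, a) ** 2))
```

For a = q = 1, E[φ'(Z)²] = P(|Z| < 1) ≈ 0.68, so the criterion gives σ_w ≈ 1.21 rather than the σ_w = 1 of a naive Gaussian initialization; this is the kind of value the training scripts would consume.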

Contact

Vinayak Abrol [email protected]

Michael Murray [email protected]
