FastActivations.jl

A collection of activation function approximations for Flux.

Author: NTimmons | 8 stars | Started: December 2019 | Last updated: 6 months ago


In some models the precision of the sigmoid and tanh functions can be reduced without any loss of accuracy in training. Switching to an approximation can therefore reduce training times significantly for such models.

Sigmoid Approximations

For sigmoid we provide fitted approximations using Taylor and Padé curve-fit models, as well as an implementation that uses a fast exp based on the limit definition exp(x) = lim_{n->inf} (1 + x/n)^n.
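As a sketch of the fast-exp idea (the function names below are illustrative, not this package's actual API): taking n = 256 = 2^8 in the limit formula, (1 + x/256)^256 can be computed with just eight squarings, and the result plugged into the usual sigmoid form.

```julia
# Sketch of a fast sigmoid built on the limit definition of exp:
#   exp(x) = lim_{n -> inf} (1 + x/n)^n
# With n = 256 the power is eight repeated squarings.
# `fast_exp` and `fast_sigmoid` are assumed names for illustration.
function fast_exp(x)
    y = 1.0 + x / 256.0
    for _ in 1:8            # (1 + x/256)^256 via repeated squaring
        y *= y
    end
    return y
end

fast_sigmoid(x) = 1.0 / (1.0 + fast_exp(-x))
```

The approximation is tight near zero and degrades slowly as |x| grows, which is usually acceptable for activations whose inputs are roughly normalised.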

(Plots: Fitted Functions | Fast Exp)

Theano Sigmoid

There is also an implementation of TheanoFastSigmoid, the fast sigmoid currently accepted into the Theano project. It is included mostly for comparison, as it is both slower and less accurate than the other sigmoid approximations here.

Tanh Approximations

For tanh we provide fitted approximations using Taylor and Padé curve-fit models, as well as an implementation based on the continued-fraction expansion of tanh.
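A minimal sketch of the continued-fraction approach (an assumed truncation depth, not necessarily the package's exact code): Lambert's continued fraction for tanh, cut off after four levels, gives

tanh(x) ≈ x / (1 + x² / (3 + x² / (5 + x² / 7)))

```julia
# Sketch: tanh via Lambert's continued fraction, truncated at depth four.
# `cf_tanh` is an illustrative name, not this package's API.
function cf_tanh(x)
    x2 = x * x
    return x / (1 + x2 / (3 + x2 / (5 + x2 / 7)))
end
```

The truncation is very accurate for small |x| (better than 1e-4 on roughly [-1, 1]) and drifts for larger inputs, where tanh is nearly saturated anyway; deeper truncations or clamping can recover accuracy there.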


We also provide the serpentine function.
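For reference, the serpentine curve is y = a·b·x / (x² + a²); the scaling below (a = b = 2, chosen here so the slope at the origin matches tanh's) is an assumption for illustration, and the package's parameterisation may differ. Note that, unlike tanh, the serpentine curve peaks at ±1 and then decays back toward zero, so it only tracks tanh for inputs near the origin.

```julia
# Sketch of a serpentine-curve activation: y = a*b*x / (x^2 + a^2).
# a = b = 2 gives unit slope at 0 and peaks of ±1 at x = ±2.
# `serpentine` and the chosen constants are assumptions, not this package's API.
serpentine(x) = 4.0 * x / (x * x + 4.0)
```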

(Plots: Fitted Functions | Continued Fraction | Serpentine)