SELU

neuralpy.layers.activation_functions.SELU(name=None)
info

The SELU Activation Function is mostly stable and can be used in any project. The chance of breaking changes in the future is very low.

Applied element-wise, as: SELU(x) = scale * (max(0, x) + min(0, α * (exp(x) − 1))), where α ≈ 1.6733 and scale ≈ 1.0507.

To learn more about SELU, please check the PyTorch documentation.
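
As a quick sanity check (an illustration, not part of the NeuralPy API), the following sketch implements the formula above directly in PyTorch and compares it against torch.nn.SELU, using the α and scale constants from the PyTorch documentation:

import torch

# SELU constants (from the PyTorch documentation)
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def manual_selu(x):
    # scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
    return SCALE * (torch.clamp(x, min=0) + torch.clamp(ALPHA * (torch.exp(x) - 1), max=0))

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
print(manual_selu(x))       # manual implementation of the formula
print(torch.nn.SELU()(x))   # built-in PyTorch SELU, should match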

Supported Arguments

  • name: (String) Name of the activation function layer. If not provided, a unique name is automatically generated for the layer.

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import SELU
# Making the model
model = Sequential()

# Adding a Dense layer followed by the SELU activation function
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))
model.add(SELU(name="selu"))