GELU
neuralpy.layers.activation_functions.GELU(name=None)
The GELU Activation Function is mostly stable and can be used in any project. The chance of breaking changes in the future is very low.
Applies the Gaussian Error Linear Units function: GELU(x) = x * Φ(x), where Φ(x) is the cumulative distribution function of the standard Gaussian distribution.
To learn more about GELU, please check the PyTorch documentation.
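The identity GELU(x) = x * Φ(x) can be checked numerically. The snippet below is a minimal sketch using plain PyTorch (the backend NeuralPy runs on) and is not part of the NeuralPy API; it expresses Φ(x) through the error function as 0.5 * (1 + erf(x / √2)).

import torch

# Φ(x) for the standard normal distribution, written via the error function
x = torch.linspace(-3.0, 3.0, steps=7)
phi = 0.5 * (1.0 + torch.erf(x / 2 ** 0.5))

# GELU(x) = x * Φ(x) matches PyTorch's built-in (exact, erf-based) GELU
print(x * phi)
print(torch.nn.functional.gelu(x))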
Supported Arguments
name
: (String) Name of the activation function layer. If not provided, a unique name is automatically generated for the layer.
Example Code
from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import GELU
# Making the model
model = Sequential()
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))
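# Adding a GELU activation function layer after the Dense layer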
model.add(GELU(name="myLayer"))
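Because GELU is applied element-wise, the layer does not take any shape arguments; the output shape is inherited from the preceding Dense layer.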