Leaky ReLU

neuralpy.layers.activation_functions.LeakyReLU(negative_slope=0.01, name=None)
info

The LeakyReLU activation function is stable and can be used in any project. The chance of future breaking changes is very low.

Applies the element-wise function: LeakyReLU(x) = max(0, x) + negative_slope * min(0, x).
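
For a quick illustration of the formula, here is a minimal plain-Python sketch (the leaky_relu helper below is for illustration only and is not part of NeuralPy):

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU(x) = max(0, x) + negative_slope * min(0, x)
    return max(0.0, x) + negative_slope * min(0.0, x)

print(leaky_relu(3.0))   # 3.0   -> positive inputs pass through unchanged
print(leaky_relu(-2.0))  # -0.02 -> negative inputs are scaled by negative_slope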

To learn more about LeakyReLU, please check the PyTorch documentation.

Supported Arguments

  • negative_slope: (Float) Slope applied to negative inputs; the default value is 0.01
  • name: (String) Name of the activation function layer; if not provided, a unique name is generated automatically for the layer.

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import LeakyReLU

# Making the model
model = Sequential()
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))
model.add(LeakyReLU(name="myFunction"))
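
Both supported arguments can also be set explicitly. A minimal sketch, continuing the model above (the slope value 0.2 and the layer name "customSlope" are arbitrary choices for illustration):

# Use a custom slope for negative inputs
model.add(LeakyReLU(negative_slope=0.2, name="customSlope"))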