ReLU

neuralpy.layers.activation_functions.ReLU(name=None)
info

The ReLU Activation Function is mostly stable and can be used in any project. The chance of future breaking changes is very low.

Applies the rectified linear unit function element-wise: ReLU(x) = max(0,x).

To learn more about ReLU, please check the PyTorch documentation.
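
Because NeuralPy runs on top of PyTorch, the element-wise behaviour of ReLU is easy to verify with a plain PyTorch tensor. The snippet below is a minimal sketch for illustration only, assuming PyTorch is installed; it calls torch.nn.ReLU directly rather than the NeuralPy layer.

import torch
from torch import nn

relu = nn.ReLU()
x = torch.tensor([-3.0, -0.5, 0.0, 2.0])

# Negative values are replaced by 0, non-negative values pass through unchanged
print(relu(x))  # tensor([0., 0., 0., 2.])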

Supported Arguments

  • name: (String) Name of the activation function layer. If not provided, a unique name is automatically generated for the layer.

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import ReLU
# Making the model
model = Sequential()
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))
model.add(ReLU(name="reluActivation"))
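
To see what the stack above roughly corresponds to in raw PyTorch, the sketch below builds a comparable torch.nn module by hand. This is an illustration only, not NeuralPy's internal implementation; the layer sizes simply mirror the example above.

import torch
from torch import nn

# A comparable stack built directly with torch.nn (for illustration only)
equivalent = nn.Sequential(
    nn.Linear(in_features=1, out_features=1, bias=True),
    nn.ReLU(),
)

x = torch.tensor([[-2.0], [0.5], [3.0]])
# Each output is max(0, w * x + b), so negative pre-activations become 0
print(equivalent(x))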