Sigmoid

neuralpy.layers.activation_functions.Sigmoid(name=None)
info

The Sigmoid activation function is mostly stable and can be used in any project. The chance of future breaking changes is very low.

Applies the element-wise function: Sigmoid(x) = σ(x) = 1 / (1 + exp(−x)).
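As a quick sanity check, the snippet below is a minimal sketch in plain PyTorch (which NeuralPy builds on): it computes the formula element-wise and compares it with torch.sigmoid. The input values are arbitrary example numbers.

import torch

x = torch.tensor([-2.0, 0.0, 2.0])   # arbitrary example inputs
manual = 1 / (1 + torch.exp(-x))     # σ(x) = 1 / (1 + exp(−x)), applied element-wise
builtin = torch.sigmoid(x)           # PyTorch's built-in element-wise sigmoid
print(manual)    # tensor([0.1192, 0.5000, 0.8808])
print(builtin)   # matches the manual computation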

To learn more about Sigmoid, please check the PyTorch documentation.

Supported Arguments

  • name: (String) Name of the activation function layer. If not provided, a unique name is generated for the layer automatically, as illustrated below.
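For illustration, both usages are sketched below; the custom name "my_sigmoid" is just an example value, not a required identifier.

from neuralpy.layers.activation_functions import Sigmoid

auto_named = Sigmoid()                     # a unique layer name is generated automatically
custom_named = Sigmoid(name="my_sigmoid")  # explicit, user-chosen layer name (example value)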

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import Sigmoid
# Making the model
model = Sequential()
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))
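# Adding the Sigmoid activation layer on top of the Dense layer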
model.add(Sigmoid(name="sigmoid_layer"))