Softmax

neuralpy.layers.activation_functions.Softmax(dim=None, name=None)

The Softmax activation function is stable and can be used in any project; the chance of future breaking changes is very low.

Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.
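
Concretely, for an input vector x, each output element is computed as

$$\mathrm{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}$$

so larger inputs map to proportionally larger probabilities, and the outputs together form a valid probability distribution.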

To learn more about Softmax, please check the PyTorch documentation.

Supported Arguments

  • dim: (Integer) The dimension along which Softmax will be computed (so every slice along dim will sum to 1); see the sketch after this list.
  • name: (String) Name of the activation function layer; if not provided, a unique name is generated for the layer automatically.
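
To see what dim does in practice, here is a minimal sketch in plain PyTorch (which NeuralPy builds on); it is illustrative only and not part of the NeuralPy API:

import torch
from torch import nn

# Two samples with three class scores each
scores = torch.tensor([[1.0, 2.0, 3.0],
                       [1.0, 1.0, 1.0]])

# dim=1 normalizes across the class dimension, so each row sums to 1
softmax = nn.Softmax(dim=1)
probs = softmax(scores)

print(probs.sum(dim=1))  # tensor([1., 1.])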

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.linear import Dense
from neuralpy.layers.activation_functions import Softmax
# Making the model
model = Sequential()

# Dense layer producing the raw scores (logits)
model.add(Dense(n_nodes=1, n_inputs=1, bias=True, name="Input Layer"))

# Softmax rescales those scores into probabilities that sum to 1
model.add(Softmax(name="softmax_layer"))
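
Since NeuralPy's Softmax follows the semantics of PyTorch's torch.nn.Softmax (see the PyTorch documentation linked above), the default dim=None lets PyTorch infer the dimension to normalize over, which recent PyTorch versions flag as deprecated. When the input has more than one dimension, passing an explicit dim avoids the warning, for example:

model.add(Softmax(dim=1, name="softmax_layer"))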