Cross Entropy Loss
neuralpy.loss_functions.CrossEntropyLoss(weight=None, ignore_index=-100, reduction='mean')
info
Cross Entropy Loss is mostly stable and can be used in any project. The chance of future breaking changes is very low.
Applies a Cross-Entropy Loss function to the model.
Cross-Entropy Loss automatically applies a Softmax to the model output, so there is no need to add a Softmax layer at the end of the model.
For more information, check the PyTorch documentation for torch.nn.CrossEntropyLoss.
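Because the Softmax is applied inside the loss function, the model's final layer should output raw, unnormalized scores (logits). The following is a minimal plain-NumPy sketch, not NeuralPy code, of what the loss computes for a single sample; the logits and target are made up for illustration:
import numpy as np

# Raw logits from the final layer of the model (no Softmax layer added)
logits = np.array([2.0, 0.5, -1.0])
target = 0  # index of the correct class

# Softmax is applied internally, then the negative
# log-probability of the target class is taken
probs = np.exp(logits) / np.sum(np.exp(logits))
loss = -np.log(probs[target])
print(loss)  # roughly 0.24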
Supported Arguments
weight=None
: (Numpy Array | List) Manual rescaling weight given to each of the classes.

ignore_index=-100
: (Integer) Specifies a target value that is ignored and does not contribute to the input gradient.

reduction='mean'
: (String) Specifies the reduction that is to be applied to the output.
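To make these arguments concrete, here is a hedged NumPy sketch of the usual semantics of weight and reduction (NeuralPy wraps PyTorch's torch.nn.CrossEntropyLoss, which behaves this way; the per-sample losses below are made up for illustration):
import numpy as np

# Hypothetical unweighted per-sample losses and their target classes
per_sample = np.array([0.2, 1.5, 0.7])
targets = np.array([0, 1, 2])

# weight rescales each sample's loss by the weight of its target class
weight = np.array([1.0, 2.0, 1.0])
scaled = per_sample * weight[targets]

print(scaled.sum() / weight[targets].sum())  # reduction='mean' (weighted mean)
print(scaled.sum())                          # reduction='sum'
print(scaled)                                # reduction='none' (unreduced)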
Code Example
import numpy as np
from neuralpy.models import Sequential
from neuralpy.optimizer import Adam
from neuralpy.loss_functions import CrossEntropyLoss
...
# Rest of the imports
...
model = Sequential()
...
# Rest of the architecture
...
# Weights for the different classes; here 3 is the number of classes
weight = np.ones([3])
model.compile(optimizer=Adam(), loss_function=CrossEntropyLoss(weight=weight, ignore_index=-100, reduction='mean'))
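Note that the weight array must contain one entry per class; an all-ones array such as np.ones([3]) weights every class equally, which behaves the same as leaving weight=None.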