BatchNorm3D

neuralpy.layers.normalization.BatchNorm3D(num_features=None, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, name=None)
info

BatchNorm3D is mostly stable and can be used in any project. The possibility of breaking changes in the future is very low.

Applies Batch Normalization over a 5D input (a mini-batch of 3D inputs with an additional channel dimension).

To learn more about BatchNorm3D layers, please check the PyTorch documentation.
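As a point of reference, here is a minimal sketch using torch.nn.BatchNorm3d (the PyTorch layer this class is based on) that illustrates the expected (N, C, D, H, W) input shape; the sizes below are arbitrary illustrative values:

import torch
import torch.nn as nn

# BatchNorm3D normalizes each of the C channels over the (N, D, H, W)
# dimensions of a 5D input with shape (N, C, D, H, W)
batch_norm = nn.BatchNorm3d(num_features=8, eps=1e-05, momentum=0.1)

x = torch.randn(4, 8, 16, 16, 16)  # N=4, C=8, D=H=W=16
y = batch_norm(x)

print(y.shape)  # torch.Size([4, 8, 16, 16, 16]) -- the shape is unchanged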

Supported Arguments

  • num_features: (Integer) C from an expected input of size (N, C, D, H, W)
  • eps=1e-05: (Float) A value added to the denominator for numerical stability. Default: 1e-05
  • momentum=0.1: (Float) The value used for the running_mean and running_var computation. Can be set to None for a cumulative moving average (i.e. a simple average). Default: 0.1
  • affine=True: (Boolean) If set to True, this module has learnable affine parameters.
  • track_running_stats=True: (Boolean) If set to True, this module tracks the running mean and variance; if set to False, it does not track such statistics and always uses batch statistics in both training and eval modes (see the sketch after this list).
  • name=None: (String) Name of the layer; if not provided, a unique name is automatically generated for the layer
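To illustrate the track_running_stats behavior, here is a small sketch using the underlying torch.nn.BatchNorm3d (the shapes and values are arbitrary and only for illustration):

import torch
import torch.nn as nn

# With track_running_stats=True (the default), running statistics are
# updated in training mode and reused in eval mode
bn = nn.BatchNorm3d(num_features=8, momentum=0.1, track_running_stats=True)
x = torch.randn(4, 8, 16, 16, 16)

bn.train()
bn(x)                       # updates bn.running_mean and bn.running_var
print(bn.running_mean[:3])  # no longer all zeros after the forward pass

bn.eval()
bn(x)                       # uses the stored running statistics, not batch statistics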

Example Code

from neuralpy.models import Sequential
from neuralpy.layers.normalization import BatchNorm3D
from neuralpy.layers.convolutional import Conv3D
# Making the model
model = Sequential()
model.add(Conv3D(filters=8, kernel_size=3, input_shape=(1, 28, 28, 28), stride=1, name="first cnn"))
model.add(BatchNorm3D(eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, name="batch norm"))
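If you prefer to set the channel count yourself rather than relying on it being picked up from the previous layer, num_features can be passed explicitly. A minimal sketch using only the arguments documented above (the value 8 simply mirrors the 8 Conv3D filters in the example; the layer name is hypothetical):

from neuralpy.layers.normalization import BatchNorm3D

# num_features should match the channel dimension C of the incoming
# (N, C, D, H, W) tensor -- here 8, matching the 8 Conv3D filters above
batch_norm_layer = BatchNorm3D(num_features=8, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, name="batch norm explicit")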