LSTM
neuralpy.layers.recurrent.LSTM(hidden_size, num_layers=1, input_size=None, bias=True, batch_first=False, dropout=0, bidirectional=False, name=None)
Danger: The LSTM layer is unstable and buggy, and is not ready for any real use.
Applies a multi-layer long short-term memory (LSTM) RNN to an input sequence.
To learn more about LSTM, please check the PyTorch documentation.
Supported Arguments:
hidden_size
: (Integer) The number of features in the hidden state
num_layers=1
: (Integer) Number of recurrent layers
input_size=None
: (Integer) The number of expected features in the input
bias=True
: (Boolean) If true, then the layer uses bias weights. Defaults to true
batch_first=False
: (Boolean) If true, then the input and output tensors are provided as (batch, seq, feature). Default: false
dropout=0
: (Float) If non-zero, introduces a Dropout layer on the outputs of each LSTM layer except the last layer, with dropout probability equal to dropout. Default: 0
bidirectional=False
: (Boolean) If true, becomes a bidirectional LSTM. Default: false
name=None
: (String) Name of the layer; if not provided, a unique name is automatically generated for the layer
Example Code
from neuralpy.models import Sequential
from neuralpy.layers.recurrent import LSTM

# Making the model
model = Sequential()

# The first LSTM layer needs an explicit input_size
model.add(LSTM(hidden_size=256, num_layers=4, input_size=28))

# Subsequent layers can leave input_size=None and infer it from the previous layer
model.add(LSTM(hidden_size=128, num_layers=2))
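The optional arguments are passed the same way. The following is a minimal sketch of a layer configured with batch_first, dropout, and bidirectional; the sizes and the layer name are arbitrary values chosen only for illustration:

from neuralpy.models import Sequential
from neuralpy.layers.recurrent import LSTM

# Making the model
model = Sequential()

# Bidirectional 2-layer LSTM that expects (batch, seq, feature) input
# and applies dropout between the stacked layers
model.add(LSTM(
    hidden_size=128,
    num_layers=2,
    input_size=64,
    batch_first=True,
    dropout=0.2,
    bidirectional=True,
    name="lstm_encoder"  # hypothetical name, any string works
))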