Support

If you need help or any information regarding NeuralPy, here are the steps you can take to communicate with the team.

  1. Raise an issue on GitHub
  2. Join our Discord server (https://discord.gg/6aTTwbW)
  3. Contact Abhishek Chatterjee (abhishek.chatterjee97@protonmail.com)