Neural Network

To parameterize a neural network, we use a configuration based on ludwig. The main building block for neural networks is tb.NNLayer(), which implements a fully connected layer. The parameters of NNLayer are listed below, followed by a short configuration sketch.

  • output_size(Default: 64) → Output size of the fully connected layer.
  • residual_connections(list[int]) → List of indices of layers with which to establish residual connections.
  • activation(Default: relu) → Activation function applied to the output of the fully connected layer. Options are elu, leakyRelu, logSigmoid, relu, sigmoid, tanh, and softmax.
  • dropout(Default: 0.3) → Dropout rate applied to the fully connected layer. Increasing dropout is a common form of regularization to combat overfitting. The dropout is expressed as the probability of an element being zeroed out (0.0 means no dropout).
  • use_bias(Default: True) → Whether the layer uses a bias vector. Options are True and False.
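
For example, individual layers can be customized before being passed to the model. The parameter values below are illustrative, not recommendations:

import turboml as tb

# A wider hidden layer with tanh activation and lighter dropout.
hidden = tb.NNLayer(output_size=128, activation="tanh", dropout=0.1)

# A layer that adds a residual connection back to the layer at index 0.
skip = tb.NNLayer(residual_connections=[0])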

Parameters

  • dropout(Default: 0) → Dropout value to use for the overall model.
  • layers(list[NNLayer]) → Neural Network layers. By default, we pass 3 layers: [tb.NNLayer(), tb.NNLayer(), tb.NNLayer(output_size=1, activation="sigmoid")]
  • loss_function(Default: mse) → Which loss function to optimize. Options are l1, mse, cross_entropy, nll, poisson_nll and bce.
  • learning_rate(Default: 1e-2) → Initial learning rate.
  • optimizer(Default: sgd) → Which optimizer to use. Options are sgd and adam.
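
Putting these options together, a binary-classification configuration using the adam optimizer might look like the sketch below; the specific values are illustrative:

import turboml as tb

# Illustrative setup: two hidden layers feeding a single sigmoid
# output unit, trained with binary cross-entropy via adam.
model = tb.NeuralNetwork(
    layers=[
        tb.NNLayer(),
        tb.NNLayer(),
        tb.NNLayer(output_size=1, activation="sigmoid"),
    ],
    loss_function="bce",
    learning_rate=1e-3,
    optimizer="adam",
)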

Example Usage

We can create an instance of the Neural Network model and deploy it like this:

import turboml as tb

# Two hidden layers with defaults, plus a single-unit sigmoid output layer.
model = tb.NeuralNetwork(layers=[tb.NNLayer(), tb.NNLayer(), tb.NNLayer(output_size=1, activation="sigmoid")])
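
The call above only configures the model; deployment is a separate step. The snippet below is a sketch, assuming the model.deploy(name=..., input=..., labels=...) pattern from the TurboML quickstart, where features and label are hypothetical handles to your registered input and label datasets:

# Hypothetical dataset handles from your own registered datasets.
deployed_model = model.deploy(name="nn_model", input=features, labels=label)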