smlp

Classes

SMLP

Simple Multi-Layer Perceptron used in the project.

Module Contents

class smlp.SMLP(input_size=784, hidden_size=77, output_size=6)

Bases: torch.nn.Module

Simple Multi-Layer Perceptron used in the project.

The network has four linear layers. The first three are followed by ReLU activations and the final layer returns logits. The implementation expects flattened inputs of shape (batch_size, input_size) (for MNIST: input_size=784).

Parameters

input_size : int

Dimensionality of the flattened input (default: 784 for 28x28 images).

hidden_size : int

Number of neurons in the hidden layers (default: 77 in this project).

output_size : int

Number of output classes (default: 6 for digits 4..9 remapped to 0..5).

layer1 : torch.nn.Linear
layer2 : torch.nn.Linear
layer3 : torch.nn.Linear
layer4 : torch.nn.Linear
relu : torch.nn.ReLU
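The class description above pins down the layer layout (four `torch.nn.Linear` layers, ReLU after the first three, logits from the last), but not the widths of the intermediate layers. A minimal sketch, assuming all three hidden layers share the same `hidden_size` width:

```python
import torch
import torch.nn as nn


class SMLP(nn.Module):
    """Simple MLP: three Linear+ReLU blocks, then a final Linear layer returning logits.

    Sketch based on the documented attributes; the equal hidden-layer
    widths are an assumption, not confirmed by the source.
    """

    def __init__(self, input_size: int = 784, hidden_size: int = 77,
                 output_size: int = 6) -> None:
        super().__init__()
        self.layer1 = nn.Linear(input_size, hidden_size)
        self.layer2 = nn.Linear(hidden_size, hidden_size)
        self.layer3 = nn.Linear(hidden_size, hidden_size)
        self.layer4 = nn.Linear(hidden_size, output_size)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x must already be flattened to (batch_size, input_size)
        x = self.relu(self.layer1(x))
        x = self.relu(self.layer2(x))
        x = self.relu(self.layer3(x))
        return self.layer4(x)  # raw logits, no softmax
```

Returning raw logits (rather than applying softmax) is the conventional choice when training with `torch.nn.CrossEntropyLoss`, which applies log-softmax internally.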
forward(x: torch.Tensor) → torch.Tensor

Forward pass.

Parameters

x : torch.Tensor

Input tensor of shape (batch_size, input_size). The method does not flatten its input; the caller must supply an already-flattened tensor.

Returns

torch.Tensor

Logits tensor of shape (batch_size, output_size).
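Since `forward` expects pre-flattened input, the caller is responsible for reshaping image batches first. A short illustration of that contract, using a hypothetical `nn.Sequential` stand-in with the documented default sizes (784 → 77 → 77 → 77 → 6; the equal hidden widths are an assumption):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in with the documented default sizes.
model = nn.Sequential(
    nn.Linear(784, 77), nn.ReLU(),
    nn.Linear(77, 77), nn.ReLU(),
    nn.Linear(77, 77), nn.ReLU(),
    nn.Linear(77, 6),
)

images = torch.randn(32, 1, 28, 28)    # MNIST-style batch of 28x28 images
x = images.view(images.size(0), -1)    # caller flattens to (32, 784)
logits = model(x)                      # shape (32, 6), one logit per class
```

Passing the unflattened `images` tensor directly would raise a shape-mismatch error in the first linear layer, which is why the docstring stresses the flattening requirement.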