Authors: Alberto Marinelli, Martina Melero Cavallo
Our goal was to create, from scratch, an Artificial Neural Network able to solve both classification and regression problems. Although it works on arbitrary datasets, in our case it was tested on the MONK's problems and on simple regression problems.
For this project, a Neural Network trained through a classical Back-Propagation (BP) approach, employing both the momentum gradient-based optimization technique and L2-regularization, was implemented using the MATLAB programming language.
The weights of the Neural Network are created in the init.m function, which lets you choose your own architecture for the network by providing an array in which each element is the number of neurons of the corresponding layer (e.g. [2, 4, 1] creates a Neural Network whose input layer has two neurons, whose hidden layer has four neurons, and whose output layer has a single neuron; the biases are also added appropriately).
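As an illustration, a minimal initialization along these lines could look as follows (the cell-array layout, the variable names, and the uniform initialization range are our assumptions for this sketch, not necessarily the exact choices made in init.m):

    function W = init(layers)
    % INIT  Create the weight matrices of the network.
    % layers : array of layer sizes, e.g. [2, 4, 1]
    % W      : cell array where W{l} has size layers(l+1) x (layers(l)+1);
    %          the extra column holds the bias weights feeding layer l+1.
        W = cell(1, numel(layers) - 1);
        for l = 1:numel(layers) - 1
            % small random weights in [-0.5, 0.5] (assumed range)
            W{l} = rand(layers(l+1), layers(l) + 1) - 0.5;
        end
    end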
The network can then be trained in batch mode using the train.m function by providing the weight matrices, the Training Set, and the training hyper-parameters (such as the learning rate, the momentum coefficient, and the L2-regularization strength).
At each epoch, the training calls the feedforward.m function to compute the outputs of the network and the backpropagation.m function to compute the gradient of the loss function with respect to the weights; finally, the weights are updated through update_weights.m.
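Conceptually, one epoch of this batch training loop can be pictured as follows (the function signatures below are illustrative assumptions, since only the file names are fixed by the project):

    deltas = [];                      % previous weight changes, for momentum
    for epoch = 1:max_epochs
        % forward pass over the whole Training Set
        [outputs, activations] = feedforward(W, X_train);
        % gradient of the loss with respect to every weight matrix
        gradients = backpropagation(W, activations, y_train);
        % gradient step with momentum and L2-regularization
        [W, deltas] = update_weights(W, gradients, deltas, eta, alpha, lambda);
    end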
In the update_weights.m function the weight changes are computed by combining the current gradient with the momentum term (a fraction of the previous weight change) and with the L2-regularization penalty, and are then applied to the weights.
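For a single weight matrix this corresponds to the textbook momentum update with an L2 penalty; in MATLAB-like notation (eta, alpha, and lambda denote the learning rate, the momentum coefficient, and the regularization strength; the variable names are ours, not necessarily those used in update_weights.m):

    % new weight change: momentum term minus the scaled (regularized) gradient
    deltas{l} = alpha * deltas{l} - eta * (gradients{l} + lambda * W{l});
    % apply the change to the current weights
    W{l} = W{l} + deltas{l};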