- This is a working implementation of a vectorized fully-connected neural network in NumPy
- The backpropagation algorithm is implemented in a fully-vectorized fashion over a given minibatch
- This enables us to take advantage of powerful built-in NumPy APIs (and avoid clumsy nested loops!), consequently improving training speed
- Backpropagation code lies in the method `take_gradient_step_on_minibatch` of class `NeuralNetwork` (see `src/neural_network.py`)
- Refer to in-code documentation and comments for a description of how the code works
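The idea behind minibatch-vectorized backpropagation can be sketched as follows for a single hidden layer. This is a minimal illustration with assumed shapes, sigmoid activations, and mean squared error; it is not the project's actual `take_gradient_step_on_minibatch` code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(X, Y, W1, b1, W2, b2, lr=0.1):
    """One vectorized gradient step over a minibatch.

    X: (batch, n_in) inputs, Y: (batch, n_out) targets.
    Sigmoid activations and MSE loss are assumed for illustration.
    """
    # Forward pass over the whole minibatch at once
    Z1 = X @ W1 + b1          # (batch, n_hidden)
    A1 = sigmoid(Z1)
    Z2 = A1 @ W2 + b2         # (batch, n_out)
    A2 = sigmoid(Z2)

    batch = X.shape[0]
    # Backward pass: each delta is computed for all examples simultaneously
    d2 = (A2 - Y) * A2 * (1 - A2)        # (batch, n_out)
    d1 = (d2 @ W2.T) * A1 * (1 - A1)     # (batch, n_hidden)

    # Gradients are minibatch averages, obtained via matrix products
    # instead of per-example loops
    W2 -= lr * (A1.T @ d2) / batch
    b2 -= lr * d2.mean(axis=0)
    W1 -= lr * (X.T @ d1) / batch
    b1 -= lr * d1.mean(axis=0)
    return W1, b1, W2, b2
```

Note how every per-example loop is replaced by a matrix product over the batch dimension, which is where the speedup over nested loops comes from.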
- Directory `src/` contains the implementation of neural networks:
  - `src/neural_network.py` contains the actual implementation of the `NeuralNetwork` class (including vectorized backpropagation code)
  - `src/activations.py` and `src/losses.py` contain implementations of activation functions and losses, respectively
  - `src/utils.py` contains code to display a confusion matrix
- `main.py` contains driver code that trains an example neural network configuration using the `NeuralNetwork` class
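Activation and loss modules in this style typically pair each function with its derivative so backpropagation can chain them. Here is a minimal sketch; the function names are assumptions for illustration, not necessarily those used in `src/activations.py` and `src/losses.py`:

```python
import numpy as np

def relu(z):
    # Element-wise max(0, z), vectorized over the whole minibatch
    return np.maximum(0.0, z)

def relu_derivative(z):
    # 1 where z > 0, else 0
    return (z > 0).astype(z.dtype)

def softmax(z):
    # Shift by the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy_loss(probs, one_hot_targets, eps=1e-12):
    # Mean negative log-likelihood over the minibatch
    return -np.mean(np.sum(one_hot_targets * np.log(probs + eps), axis=1))
```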
- To download MNIST data, install `python-mnist` through the `git clone` method (run the script to download data; ensure the `python-mnist` directory exists inside the root directory of this project)
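Once the data is in place, it can be loaded with python-mnist's `MNIST` class and converted into NumPy arrays suitable for training. A sketch below; the data path is an assumption, and synthetic arrays stand in for the real dataset so the preprocessing step is self-contained:

```python
import numpy as np

# Loading with python-mnist (path is an assumed example):
#   from mnist import MNIST
#   mndata = MNIST('./python-mnist/data')
#   images, labels = mndata.load_training()
# Synthetic stand-ins for the loaded data (64 images of 28x28 = 784 pixels):
images = np.random.randint(0, 256, size=(64, 784))
labels = np.random.randint(0, 10, size=64)

X = images.astype(np.float64) / 255.0   # scale pixel values to [0, 1]
Y = np.eye(10)[labels]                  # one-hot encode the digit labels
```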
- Implement class `NeuralNetwork`
- Implement common activation and loss functions
- Test implementation on MNIST data