Homework 2
In this assignment you will practice writing backpropagation code and training Neural Networks. The goals of this assignment are as follows:
- understand Neural Networks and how they are arranged in layered architectures
- understand and be able to implement (vectorized) backpropagation (a short sketch follows this list)
- implement various update rules used to optimize Neural Networks
- implement batch normalization for training deep networks
- implement dropout to regularize networks
- effectively cross-validate and find the best hyperparameters for Neural Network architecture
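For concreteness, here is a minimal NumPy sketch of what a single vectorized layer with a forward and a backward pass can look like. The function names, shapes, and cache convention below are illustrative assumptions, not the exact interface used in the starter code.

import numpy as np

def affine_forward(x, w, b):
    # x: (N, D) batch of inputs, w: (D, M) weights, b: (M,) biases
    out = x.dot(w) + b
    cache = (x, w)              # stash what the backward pass will need
    return out, cache

def affine_backward(dout, cache):
    # dout: (N, M) upstream gradient of the loss w.r.t. this layer's output
    x, w = cache
    dx = dout.dot(w.T)          # (N, D) gradient w.r.t. the inputs
    dw = x.T.dot(dout)          # (D, M) gradient w.r.t. the weights
    db = dout.sum(axis=0)       # (M,)  gradient w.r.t. the biases
    return dx, dw, db

Chaining such forward/backward pairs layer by layer is essentially all that vectorized backpropagation amounts to; no per-example loops are needed.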
Setup
Get the starter code by cloning the hw2 GitHub repository. This can be accomplished by executing the following command:
git clone https://github.com/comp150DL/hw2.git
Set up virtualenv: If you have not created a virtualenv for handling the Python dependencies related to this course, please follow the Virtualenv tutorial.
If you would like to work on the provided AWS instances, please follow the Tufts AWS tutorial for how to connect to your Jupyter Notebook remotely.
To satisfy all software dependencies, start your virtualenv and double-check that all required packages are installed:
workon deep-venv
cd hw2
pip install -r requirements.txt
Download data: Once you have the starter code, you will need to download the CIFAR-10 dataset. Run the following from the hw2 directory:
cd datasets
./get_datasets.sh
Start Jupyter Notebook: After you have the CIFAR-10 data, you should start the Jupyter Notebook server from the hw2 directory. If you are unfamiliar with Jupyter, you should read the Jupyter tutorial.
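With the virtualenv active, starting the server usually amounts to running the following from the hw2 directory (assuming Jupyter was installed into your virtualenv by requirements.txt):

jupyter notebook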
Submitting your work
To make sure everything is working properly, remember to do a clean run (“Kernel -> Restart & Run All”) after you finish the work in each notebook, and submit the final version with all of the outputs. Once you are done working, compress all of the code and notebooks into a single archive and submit it by emailing it to comp150dl@gmail.com.
On Linux or macOS you can run the provided collectSubmission.sh script from hw2/ to produce a file hw2.zip (or hw2.tar.gz if zip is not available on your system).
Q1: Fully-connected Neural Network (40 points)
The Jupyter notebook FullyConnectedNets.ipynb
will introduce you to our
modular layer design, and then use those layers to implement fully-connected
networks of arbitrary depth. To optimize these models you will implement several
popular update rules.
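As an example of the kind of update rule involved, here is a rough NumPy sketch of SGD with momentum. The signature (weights, gradient, and a config dict holding hyperparameters and the velocity) is an assumption for illustration; the exact interface expected by the notebook may differ.

import numpy as np

def sgd_momentum(w, dw, config=None):
    # Illustrative update rule: maintain a velocity that accumulates gradients.
    if config is None:
        config = {}
    config.setdefault('learning_rate', 1e-2)
    config.setdefault('momentum', 0.9)
    v = config.get('velocity', np.zeros_like(w))

    v = config['momentum'] * v - config['learning_rate'] * dw   # update the velocity
    next_w = w + v                                               # take a step along it

    config['velocity'] = v
    return next_w, config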
Q2: Batch Normalization (40 points)
In the Jupyter notebook BatchNormalization.ipynb
you will implement batch
normalization, and use it to train deep fully-connected networks.
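For orientation, the training-time forward pass boils down to normalizing each feature over the mini-batch and then applying a learned scale and shift. A rough NumPy sketch is below; the function name, the cache contents, and the omission of running statistics and the backward pass are simplifications for illustration.

import numpy as np

def batchnorm_forward_train(x, gamma, beta, eps=1e-5):
    # x: (N, D) mini-batch; gamma, beta: (D,) learned scale and shift.
    mu = x.mean(axis=0)                      # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalize each feature
    out = gamma * x_hat + beta               # scale and shift
    cache = (x_hat, mu, var, gamma, eps)     # what a backward pass would need
    return out, cache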
Q3: Dropout (20 points)
The Jupyter notebook Dropout.ipynb
will help you implement Dropout and explore
its effects on model generalization.
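As a reminder of the mechanics, here is a minimal NumPy sketch of inverted dropout at training time (because of the division by p here, no scaling is needed at test time). The function names and the convention that p is the keep probability are assumptions; the notebook may use a different convention.

import numpy as np

def dropout_forward_train(x, p=0.5):
    # Keep each unit with probability p and scale by 1/p so that the
    # expected activation matches test time, where no mask is applied.
    mask = (np.random.rand(*x.shape) < p) / p
    out = x * mask
    return out, mask

def dropout_backward(dout, mask):
    # Gradients flow only through the units that were kept.
    return dout * mask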
Q4: Do something extra! (up to +10 points)
In the process of training your network, you are free to implement anything you want in order to get better performance. You can modify the solver, implement additional layers, use different types of regularization, use an ensemble of models, or anything else that comes to mind. If you implement any of these or other ideas not covered in the assignment, you will be awarded some bonus points.