HW 1
In this assignment you will practice putting together a simple image classification pipeline, based on the k-Nearest Neighbor or the SVM/Softmax classifier. The goals of this assignment are as follows:
- understand the basic Image Classification pipeline and the data-driven approach (train/predict stages)
- understand the train/val/test splits and the use of validation data for hyperparameter tuning.
- develop proficiency in writing efficient vectorized code with numpy (see the sketch after this list)
- implement and apply a k-Nearest Neighbor (kNN) classifier
- implement and apply a Multiclass Support Vector Machine (SVM) classifier
- implement and apply a Softmax classifier
- implement and apply a two-layer neural network classifier
- understand the differences and tradeoffs between these classifiers
- get a basic understanding of the performance improvements gained by using higher-level representations than raw pixels (e.g. color histograms, Histogram of Oriented Gradients (HOG) features)
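To make the vectorization goal concrete, here is a minimal sketch (toy shapes and random data, not the starter-code API) of replacing a Python double loop over image pairs with broadcasting and a single matrix multiply:

```python
import numpy as np

# Toy stand-ins for CIFAR-10 rows: 5 test and 8 train images, 3072 pixels each.
X_test = np.random.randn(5, 3072)
X_train = np.random.randn(8, 3072)

# Loop version: one Python-level iteration per (test, train) pair -- slow.
dists_loop = np.zeros((5, 8))
for i in range(5):
    for j in range(8):
        dists_loop[i, j] = np.sqrt(np.sum((X_test[i] - X_train[j]) ** 2))

# Vectorized version: expand ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2, so the
# whole distance matrix comes from one matrix multiply plus broadcasting.
sq = (X_test ** 2).sum(axis=1)[:, None] \
    - 2 * X_test @ X_train.T \
    + (X_train ** 2).sum(axis=1)
dists_vec = np.sqrt(np.maximum(sq, 0))  # clamp tiny negatives from round-off

assert np.allclose(dists_loop, dists_vec)
```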
Setup
Get the starter code by cloning the hw1 GitHub repository with the following command:
git clone https://github.com/comp150DL/hw1.git
Setup Virtualenv: If you have not created a virtualenv for handling the python dependencies related to this course, please follow the Virtualenv tutorial.
If you would like to work on the provided AWS instances, please follow the Tufts AWS tutorial for how to connect to your Jupyter Notebook remotely.
To satisfy all software dependencies, start your virtualenv and double-check that all required packages are installed:
workon deep-venv
cd hw1
pip install -r requirements.txt
Download data: Once you have the starter code, you will need to download the CIFAR-10 dataset. Run the following from the hw1 directory:
cd datasets
./get_datasets.sh
Start Jupyter Notebook: After you have the CIFAR-10 data, you should start the Jupyter Notebook server from the hw1 directory. If you are unfamiliar with Jupyter, you should read the Jupyter tutorial.
Submitting your work
To make sure everything is working properly, remember to do a clean run (“Kernel -> Restart & Run All”) after you finish work on each notebook, and submit the final version with all outputs. Once you are done working, compress all the code and notebooks into a single archive and submit it by email to comp150dl@gmail.com.
On Linux or macOS you can run the provided collectSubmission.sh script from hw1/ to produce a file hw1.zip (or hw1.tar.gz if zip is not installed on your system).
Q1: k-Nearest Neighbor classifier (20 points)
The Jupyter Notebook knn.ipynb will walk you through implementing the kNN classifier.
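As a taste of what the notebook builds toward, here is a hedged sketch of the kNN prediction step. The function name and signature are illustrative assumptions, not necessarily the starter code's API; it presumes a precomputed (num_test, num_train) distance matrix and non-negative integer labels:

```python
import numpy as np

def predict_labels(dists, y_train, k=1):
    # Hypothetical helper: vote among the k nearest training labels
    # for each test point, given a (num_test, num_train) distance matrix.
    num_test = dists.shape[0]
    y_pred = np.zeros(num_test, dtype=y_train.dtype)
    for i in range(num_test):
        closest_y = y_train[np.argsort(dists[i])[:k]]  # labels of the k nearest neighbors
        y_pred[i] = np.bincount(closest_y).argmax()    # majority vote; ties go to the smaller label
    return y_pred
```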
Q2: Training a Support Vector Machine (25 points)
The Jupyter Notebook svm.ipynb will walk you through implementing the SVM classifier.
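For orientation, a minimal vectorized sketch of the multiclass SVM (hinge) loss with margin delta = 1 follows; the (D, C) weight / (N, D) data shape convention is an assumption, so adapt it to whatever the notebook specifies:

```python
import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    # W: (D, C) weights, X: (N, D) data, y: (N,) integer labels in [0, C).
    N = X.shape[0]
    scores = X @ W                                   # (N, C) class scores
    correct = scores[np.arange(N), y][:, None]       # (N, 1) correct-class scores
    margins = np.maximum(0, scores - correct + 1.0)  # hinge: max(0, s_j - s_y + 1)
    margins[np.arange(N), y] = 0                     # don't count the correct class
    return margins.sum() / N + reg * np.sum(W * W)   # data loss + L2 regularization
```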
Q3: Implement a Softmax classifier (20 points)
The Jupyter Notebook softmax.ipynb will walk you through implementing the Softmax classifier.
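Again as a hedged sketch (same assumed shape conventions as the SVM example above), the softmax cross-entropy loss with the standard max-subtraction trick for numerical stability might look like:

```python
import numpy as np

def softmax_loss_vectorized(W, X, y, reg):
    N = X.shape[0]
    scores = X @ W                                 # (N, C) raw class scores
    scores -= scores.max(axis=1, keepdims=True)    # shift rows so exp() cannot overflow
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)      # (N, C) normalized class probabilities
    loss = -np.log(probs[np.arange(N), y]).mean()  # average cross-entropy
    return loss + reg * np.sum(W * W)              # plus L2 regularization
```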
Q4: Two-Layer Neural Network (25 points)
The Jupyter Notebook two_layer_net.ipynb will walk you through the implementation of a two-layer neural network classifier.
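For a sense of the architecture, here is a sketch of an affine-ReLU-affine forward pass; the parameter names and the ReLU choice are assumptions rather than the notebook's exact interface:

```python
import numpy as np

def two_layer_forward(X, W1, b1, W2, b2):
    # X: (N, D) inputs; W1: (D, H), b1: (H,); W2: (H, C), b2: (C,).
    hidden = np.maximum(0, X @ W1 + b1)  # first affine layer + ReLU nonlinearity
    scores = hidden @ W2 + b2            # second affine layer: raw class scores
    return scores                        # feed into an SVM or softmax loss
```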
Q5: Higher Level Representations: Image Features (10 points)
The Jupyter Notebook features.ipynb will walk you through this exercise, in which you will examine the improvements gained by using higher-level representations as opposed to using raw pixel values.
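As one hedged example of such a representation, a hue histogram can be computed in a few lines; the use of matplotlib's rgb_to_hsv and the uint8 input assumption are illustrative choices, not the notebook's prescribed pipeline:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def hue_histogram(img, nbins=10):
    # img: (H, W, 3) uint8 RGB image; returns an nbins-dim feature vector.
    hue = rgb_to_hsv(img / 255.0)[:, :, 0]                 # hue channel in [0, 1]
    hist, _ = np.histogram(hue, bins=nbins, range=(0, 1))  # pixel counts per hue bin
    return hist.astype(np.float64)
```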
Q6: Cool Bonus: Do something extra! (+10 points)
Implement, investigate, or analyze something extra related to the topics in this assignment, using the code you developed. For example, is there some other interesting question we could have asked? Is there an insightful visualization you can plot? Anything fun to look at? Maybe you can experiment with a spin on the loss function? If you try out something cool, we’ll give you up to 10 extra points and may feature your results in the lecture.