Training Backpropagation Neural Nets on the handwritten digits dataset

  1. The dataset contains 5000 examples of handwritten digits; a few of them are shown below. There are 10 class labels, the digits 0 to 9, and given an image we want to classify it as one of these 10 classes (a short sketch for displaying sample digits follows this list).

    [Figure: digits.png — sample handwritten digits from the dataset]

  2. The next set of equations shows how feed-forward and back-propagation learning work in a neural net with a single hidden layer (the standard form of these equations is written out after this list).

    [Figures: n1, n2, n3 — feed-forward and back-propagation equations]

  3. The next figure shows how the cost function decreases with the number of iterations of the numerical optimization algorithm used (conjugate gradient), with 25 units in the hidden layer (a training sketch appears after this list).

    [Figure: gd.png — cost vs. iterations of conjugate gradient, 25 hidden units]

  4. The next figures show the hidden features learnt for different numbers of hidden units in the neural net (see the visualization sketch after this list).

    [Figures: h25, h36, h49, h64, h81, h100 — hidden features learnt with 25, 36, 49, 64, 81 and 100 hidden units]

  5. The next figure shows the accuracy obtained on the training dataset with different numbers of hidden units in the neural net. Ideally we would expect training-set accuracy to increase strictly with more hidden units (eventually overfitting), but there is a slight zig-zag pattern, which can be attributed to the random initialization of the weights and to the regularization (see the accuracy sketch after this list).

    [Figure: ac.png — training accuracy vs. number of hidden units]
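
The sketches referenced in the items above follow here, all in Python. First, a minimal sketch for displaying a random sample of the digits (item 1). The file name digits_X.npy and the column-major 20 x 20 pixel layout are assumptions, not something stated in the post; substitute however the 5000 x 400 data matrix is actually stored.

    import numpy as np
    import matplotlib.pyplot as plt

    # Assumption: X is a (5000, 400) array, each row a flattened 20x20
    # grayscale digit image; "digits_X.npy" is a hypothetical file name.
    X = np.load("digits_X.npy")

    rng = np.random.default_rng(0)
    fig, axes = plt.subplots(5, 5, figsize=(5, 5))
    for ax, i in zip(axes.ravel(), rng.choice(len(X), 25, replace=False)):
        ax.imshow(X[i].reshape(20, 20).T, cmap="gray")  # .T assumes column-major pixels
        ax.axis("off")
    plt.show()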
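
The equation images in item 2 (n1, n2, n3) did not survive extraction, so the following is the standard form of the feed-forward and back-propagation equations for a single-hidden-layer sigmoid network, written in LaTeX. Here the weight matrices \Theta^{(1)} and \Theta^{(2)} include bias columns, m is the number of training examples, K = 10 is the number of classes, and \lambda is the regularization strength.

    % Feed-forward: g is the sigmoid; a bias unit is prepended to x and a^{(2)}
    a^{(1)} = x, \quad
    z^{(2)} = \Theta^{(1)} a^{(1)}, \quad
    a^{(2)} = g(z^{(2)}), \quad
    z^{(3)} = \Theta^{(2)} a^{(2)}, \quad
    h_\Theta(x) = a^{(3)} = g(z^{(3)})

    % Regularized cross-entropy cost (bias weights excluded from the penalty)
    J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K}
        \Big[ y^{(i)}_k \log\big(h_\Theta(x^{(i)})\big)_k
            + (1 - y^{(i)}_k) \log\big(1 - h_\Theta(x^{(i)})\big)_k \Big]
        + \frac{\lambda}{2m} \sum_{l,j,k} \big(\Theta^{(l)}_{jk}\big)^2

    % Back-propagated errors (drop the bias row of \delta^{(2)}) and gradients
    \delta^{(3)} = a^{(3)} - y, \qquad
    \delta^{(2)} = \big(\Theta^{(2)}\big)^{\top} \delta^{(3)} \circ g'(z^{(2)}),
    \qquad g'(z) = g(z)\big(1 - g(z)\big)

    \frac{\partial J}{\partial \Theta^{(l)}}
        = \frac{1}{m}\, \delta^{(l+1)} \big(a^{(l)}\big)^{\top}
        + \frac{\lambda}{m}\, \Theta^{(l)} \quad \text{(no penalty on bias columns)}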
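
Next, a minimal NumPy/SciPy sketch of the training loop behind the cost curve in item 3, under some assumptions not stated in the post: the labels have been one-hot encoded into Y (5000 x 10), and scipy.optimize.minimize with method="CG" (a nonlinear conjugate gradient routine) stands in for whatever optimizer implementation was actually used. The history list records the cost after each iteration, which is what the figure plots.

    import numpy as np
    from scipy.optimize import minimize

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def unpack(theta, n_in, n_hid, n_out):
        # Split the flat parameter vector into the two weight matrices
        # (each includes a bias column, as in the equations above).
        s1 = n_hid * (n_in + 1)
        T1 = theta[:s1].reshape(n_hid, n_in + 1)
        T2 = theta[s1:].reshape(n_out, n_hid + 1)
        return T1, T2

    def cost_grad(theta, X, Y, lam, n_hid):
        m, n_in = X.shape
        n_out = Y.shape[1]
        T1, T2 = unpack(theta, n_in, n_hid, n_out)
        # Feed-forward pass
        A1 = np.hstack([np.ones((m, 1)), X])
        Z2 = A1 @ T1.T
        A2 = np.hstack([np.ones((m, 1)), sigmoid(Z2)])
        H = sigmoid(A2 @ T2.T)
        # Regularized cross-entropy cost (bias columns excluded)
        J = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
        J += lam / (2 * m) * (np.sum(T1[:, 1:] ** 2) + np.sum(T2[:, 1:] ** 2))
        # Back-propagation of errors
        D3 = H - Y
        D2 = (D3 @ T2[:, 1:]) * sigmoid(Z2) * (1 - sigmoid(Z2))
        G2 = D3.T @ A2 / m
        G1 = D2.T @ A1 / m
        G2[:, 1:] += lam / m * T2[:, 1:]
        G1[:, 1:] += lam / m * T1[:, 1:]
        return J, np.concatenate([G1.ravel(), G2.ravel()])

    def train(X, Y, n_hid=25, lam=1.0, maxiter=200, seed=0):
        rng = np.random.default_rng(seed)
        n_in, n_out = X.shape[1], Y.shape[1]
        eps = 0.12  # small symmetric random initialization
        theta0 = rng.uniform(-eps, eps, n_hid * (n_in + 1) + n_out * (n_hid + 1))
        history = []
        cb = lambda t: history.append(cost_grad(t, X, Y, lam, n_hid)[0])
        res = minimize(cost_grad, theta0, args=(X, Y, lam, n_hid),
                       jac=True, method="CG", callback=cb,
                       options={"maxiter": maxiter})
        return res.x, history

Calling train(X, Y, n_hid=25) and plotting history against its index gives the kind of cost-vs-iterations curve shown in item 3.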
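
To reproduce the hidden-feature grids of item 4, each hidden unit's incoming weight vector can be reshaped back into a 20 x 20 image. This sketch reuses unpack and the trained theta from the training sketch above, and relies on the post's hidden-layer sizes (25, 36, 49, 64, 81, 100) all being perfect squares.

    import matplotlib.pyplot as plt

    def show_hidden_features(theta, n_in=400, n_hid=25, n_out=10):
        # Each row of T1 (minus the bias weight) holds one hidden unit's
        # 400 incoming weights, viewable as a 20x20 image.
        T1, _ = unpack(theta, n_in, n_hid, n_out)
        side = int(np.sqrt(n_hid))  # 25 -> 5x5 grid, 36 -> 6x6, ...
        fig, axes = plt.subplots(side, side, figsize=(side, side))
        for ax, w in zip(axes.ravel(), T1[:, 1:]):
            ax.imshow(w.reshape(20, 20).T, cmap="gray")  # .T assumes column-major pixels
            ax.axis("off")
        plt.show()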
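
Finally, a sketch of the accuracy experiment in item 5, assuming y holds the integer labels 0 to 9 in the same order as the output units. It retrains the network for each hidden-layer size and reports training-set accuracy; the zig-zag in the figure comes from the random initialization inside train and from the regularization term.

    def predict(theta, X, n_hid, n_out=10):
        # Forward pass only; the predicted class is the most active output unit.
        T1, T2 = unpack(theta, X.shape[1], n_hid, n_out)
        A1 = np.hstack([np.ones((X.shape[0], 1)), X])
        A2 = np.hstack([np.ones((X.shape[0], 1)), sigmoid(A1 @ T1.T)])
        return np.argmax(sigmoid(A2 @ T2.T), axis=1)

    for n_hid in (25, 36, 49, 64, 81, 100):
        theta, _ = train(X, Y, n_hid=n_hid)
        acc = 100.0 * np.mean(predict(theta, X, n_hid) == y)
        print(f"{n_hid:4d} hidden units: {acc:6.2f}% training accuracy")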
