- Download nnet.py
- Execute with
python nnet.py
The code will automatically download the dataset.
- Visualize with
tensorboard --logdir /tmp/tboard/nnet
You can expect an accuracy of about 97%+ on the test images.
That's it!
- TensorFlow 1.4 (and TensorBoard for visualization)
- Developed and tested with Python 3.6
- You need write permission in /tmp (who doesn't?)
The following input dataset is used as an example: the MNIST dataset of handwritten digits. The code uses TensorFlow's built-in modules to download the data into your working directory. Inputs are of dimension 28x28x1 and are flattened before being fed to the model.
mnist = mnist_data.read_data_sets("data", one_hot=True, reshape=False, validation_size=0)
The model is a five-layer fully connected neural network. Softmax is used as the last layer to convert scores into class probabilities.
Five layers is overkill for this problem, but it is mostly for demonstration purposes. Following is the computational graph.
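The last layer's score-to-probability conversion can be sketched in NumPy (a numerically stable softmax; the variable names are illustrative, not taken from nnet.py):

```python
import numpy as np

def softmax(scores):
    """Convert raw class scores into probabilities that sum to 1."""
    # Subtract the max score for numerical stability; it cancels in the ratio.
    shifted = scores - np.max(scores, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

# Example: raw scores for the 10 MNIST classes for one image.
scores = np.array([1.0, 2.0, 0.5, 0.1, 3.0, 0.2, 0.3, 0.4, 0.1, 0.2])
probs = softmax(scores)  # highest score -> highest probability
```

The predicted digit is simply the index of the largest probability.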
Zooming into one of the hidden layers helps understand the structure better (see below). The weights are multiplied with the output of the previous layer and the biases are added. The result is then passed through a ReLU activation function.
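That per-layer computation, ReLU(input · W + b), can be sketched in NumPy as follows (the layer sizes and names here are illustrative, not the ones used in nnet.py):

```python
import numpy as np

rng = np.random.default_rng(0)

# Output of the previous layer: a batch of 4 examples, 200 features each.
prev_out = rng.standard_normal((4, 200))

# This layer's parameters: 200 inputs -> 100 units.
W = rng.standard_normal((200, 100)) * 0.1
b = np.zeros(100)

# Multiply by the weights, add the biases, then apply ReLU.
pre_activation = prev_out @ W + b
layer_out = np.maximum(pre_activation, 0.0)  # ReLU: max(x, 0), element-wise
```

ReLU zeroes out negative pre-activations, so the layer output is always non-negative.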
A decaying learning rate is used to converge faster. Note that while the learning rate was high, the loss function jumped around; as the learning rate decreased, the loss decreased in smaller steps and the curve became much smoother.
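In TensorFlow 1.x such a schedule is typically built with `tf.train.exponential_decay`; the schedule itself is just lr0 · decay_rate^(step/decay_steps). A plain-Python sketch (the hyperparameter values here are illustrative, not necessarily those in nnet.py):

```python
def decayed_lr(step, lr0=0.003, decay_rate=0.95, decay_steps=100):
    """Exponential learning-rate decay: big steps early, small steps later."""
    return lr0 * decay_rate ** (step / decay_steps)

# The rate shrinks smoothly as training progresses.
lrs = [decayed_lr(s) for s in (0, 500, 2000)]
```

Early in training the large rate lets the optimizer move quickly; later the small rate stops the loss from jumping around the minimum.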
A test accuracy of 97%+ was achieved.





