It’s called PyTorch. And it’s a tool designed to work seamlessly with Python libraries like NumPy and Jupyter Notebooks to create deep neural networks. As it turns out, it is much easier to use and more intuitive than Google’s TensorFlow packages. In fact, I have been trying to get TensorFlow working on my Mac laptop for about a month: each time I run it, I get a new error, and when I fix that error, I encounter another, and another, until I eventually resign myself to never being able to train a neural network on my laptop.
Fortunately, compatibility is not the only thing that PyTorch gets right.
All the squishification functions are built into the PyTorch library, like
- Sigmoid: σ(x) = 1 / (1 + e^(-x)),
- ReLU: f(x) = max(0, x), and
- Softmax: σ(z)_i = e^(z_i) / Σ_j e^(z_j).
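To see these squishification functions in action, here’s a minimal sketch that runs each one on a small tensor (the sample values are my own, just for illustration):

```python
import torch

x = torch.tensor([-1.0, 0.0, 2.0])

print(torch.sigmoid(x))         # each value squashed into (0, 1)
print(torch.relu(x))            # negatives zeroed out, positives unchanged
print(torch.softmax(x, dim=0))  # values rescaled so they sum to 1
```

Softmax is the one you typically want on the output layer, since its values behave like probabilities across the classes.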
On top of that, you can define a neural network of your own as an ordinary Python object.
Here’s the code for the above neural network that I created. It takes a 784-pixel image (a flattened 28×28 grayscale image), pumps it through two hidden layers, and then to a 10-node output where each output node represents a digit (0–9). The images in the training set are all images of handwritten numbers between zero and nine, so, once trained, this neural network should be able to identify the number that was written automatically.
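A network like the one described could be sketched as follows. The hidden layer sizes (128 and 64) are my own placeholder choices, not anything dictated by the architecture:

```python
import torch
from torch import nn

# 784 inputs (one per pixel of a 28x28 image), two hidden layers,
# and 10 outputs (one per digit, 0-9).
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
    nn.Softmax(dim=1),
)

# Feed one fake "image" of random pixels through the untrained network.
image = torch.rand(1, 784)
probabilities = model(image)
print(probabilities)  # ten values that sum to 1
```

Because the weights start out random, the ten output values will be roughly uniform; nothing meaningful comes out until the network is trained.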
These few lines of code, executed on a server (or locally), produce the following output:
See how cool that is!? Oh… right. The network seems to have no clue which number it is. That’s because all we’ve done so far is a feedforward operation on an untrained neural network with a bunch of random numbers as the weights and zeroes as the biases.
In order to make this neural network do anything useful, we have to train it, and that involves another step: back-propagation. I’ll cover that in my next blog. For now, we’re left with a useless random distribution of weights and no idea what our output should be. Enjoy!