I Built a Neural Net That Knows What Clothes You’re Wearing

Okay, maybe that is a bit of a click-baitey headline. What I really did was program a neural network with PyTorch that can distinguish between ten different clothing items that could appear in a 28×28 image. To me, that’s still pretty cool.

Here’s an example of one of the images that gets fed into the program:

Yes, this is an image of a shirt.

Can you tell what this is? It looks kind of like a long-sleeve t-shirt to me, but it is so pixelated that I can’t really tell. But that doesn’t matter. What matters is what my trained neural net thinks it is, and whether that’s what it actually is.

After training for about 2 minutes on a subset of images like this (the training set is about 750 images), my model was able to choose the correct classification for an image I fed in about 84.3% of the time. Not bad for a first go at building a clothing-classifying deep neural net.
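For context, an accuracy figure like that 84.3% is typically computed by taking the class the network assigns the highest probability and comparing it to the true label. Here is a minimal sketch of that calculation — the tensors below are made-up stand-ins, not my actual data:

```python
import torch

# Hypothetical log-probabilities for a batch of 4 images over 10 classes
log_probs = torch.log_softmax(torch.randn(4, 10), dim=1)
labels = torch.tensor([2, 5, 1, 9])  # hypothetical true labels

# The predicted class is the one with the highest (log-)probability
preds = log_probs.argmax(dim=1)
accuracy = (preds == labels).float().mean().item()
print(f"batch accuracy: {accuracy:.1%}")
```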

Below I have included the code that actually builds the network and runs a forward pass through it:


import torch.nn as nn
import torch.nn.functional as F


class Network(nn.Module):
    def __init__(self, input_size, output_size, hidden_layers, drop_p=0.5):
        ''' Builds a feedforward network with arbitrary hidden layers.
       
            Arguments
            ---------
            input_size: integer, size of the input
            output_size: integer, size of the output layer
            hidden_layers: list of integers, the sizes of the hidden layers
            drop_p: float between 0 and 1, dropout probability
        '''
        super().__init__()
        # Add the first layer, input to a hidden layer
        self.hidden_layers = nn.ModuleList([nn.Linear(input_size, hidden_layers[0])])
       
        # Add a variable number of more hidden layers
        layer_sizes = zip(hidden_layers[:-1], hidden_layers[1:])
        self.hidden_layers.extend([nn.Linear(h1, h2) for h1, h2 in layer_sizes])
       
        self.output = nn.Linear(hidden_layers[-1], output_size)
       
        self.dropout = nn.Dropout(p=drop_p)
       
    def forward(self, x):
        ''' Forward pass through the network, returns the output logits '''
       
        # Forward through each layer in `hidden_layers`, with ReLU activation and dropout
        for linear in self.hidden_layers:
            x = F.relu(linear(x))
            x = self.dropout(x)
       
        x = self.output(x)
       
        return F.log_softmax(x, dim=1)
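To see what calling a network like this looks like, here is a quick sketch using an equivalent `nn.Sequential` stack. The layer sizes (784 inputs for a flattened 28×28 image, hidden layers of 256 and 128) are my own illustrative guesses, not necessarily the exact configuration I trained:

```python
import torch
import torch.nn as nn

# Equivalent to Network(784, 10, [256, 128]) above — sizes are
# illustrative guesses for flattened 28x28 grayscale inputs
model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(256, 128), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(128, 10), nn.LogSoftmax(dim=1),
)
model.eval()  # disable dropout for a deterministic forward pass

batch = torch.randn(64, 784)  # a stand-in batch of 64 flattened images
out = model(batch)
print(out.shape)  # one log-probability per class, for each image
```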

After training the network using backpropagation and gradient descent (code below), it successfully classified the vast majority of the images I fed in, each in less than half a second. Mind you, these were grayscale images, formatted in a simple way and trained on a large enough dataset to be reliable.
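The core of a training loop like that boils down to just a few lines: run a forward pass, measure the loss, backpropagate, and take a gradient-descent step. Here is a minimal sketch — the small model, plain SGD optimizer, and synthetic batch below are my assumptions for illustration, not my exact training code:

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)  # reproducible stand-in data

# Small stand-in model and data, just to show the loop's shape
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
criterion = nn.NLLLoss()  # negative log-likelihood, pairs with log_softmax
optimizer = optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(32, 784)         # stand-in flattened 28x28 batch
labels = torch.randint(0, 10, (32,))  # stand-in class labels

losses = []
for epoch in range(5):
    optimizer.zero_grad()                    # clear accumulated gradients
    log_probs = torch.log_softmax(model(images), dim=1)
    loss = criterion(log_probs, labels)      # how wrong are we?
    loss.backward()                          # backpropagation
    optimizer.step()                         # gradient-descent update
    losses.append(loss.item())
```

Each pass through the loop nudges the weights in the direction that reduces the loss, which is all "training" really means here.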

If you want a good resource that explains what backpropagation actually does, check out another great video by 3Blue1Brown below:

So, what does this all look like? Is it all sci-fi futuristic and with lots of beeps and boops? Well… not exactly. Here’s the output of the program:

Output of my clothing-classifier neural net. It gives a probability that the photo shows each of the 10 items listed.

The software grabs each image in the test set, runs it through a forward pass of the network, and spits out a probability for each of the ten classes. Above, you can see that the network thinks this image is most likely a coat. I personally can’t tell whether it is a coat, a pullover, or just a long-sleeve shirt, but the network is about 85% confident that it is, in fact, a coat.
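Since the network’s forward pass returns log-probabilities (from `log_softmax`), getting display-friendly percentages like that 85% is just a matter of exponentiating. A small sketch — the class names are the standard Fashion-MNIST labels, which I’m assuming match the ten categories here, and the log-probabilities are random stand-ins:

```python
import torch

# Standard Fashion-MNIST class names (an assumption about the dataset)
classes = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
           "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

# Stand-in log-probabilities for one image; exp() recovers probabilities
log_probs = torch.log_softmax(torch.randn(1, 10), dim=1)
probs = torch.exp(log_probs).squeeze()

for name, p in zip(classes, probs):
    print(f"{name:12s} {p.item():.1%}")
```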

Overall, it’s pretty awesome that after only a few weeks of practice (with most of that time spent learning how to program in python) I can code my very own neural networks and they actually work!

If you’re interested, here’s a video of the neural network training itself and running through a few test images:

If you’d like to test out the code for yourself, here’s a link to my GitHub page where you can download all the files you need to get it running. Search Google if you can’t figure out how to install Python and run a Jupyter Notebook.

That’s all for now! See you soon 🙂