Using Deep Learning is Easier Than You Think


I came across a great article on using the Deep Learning Python package tflearn to perform inference on some classic Machine Learning datasets like MNIST and CIFAR-10. As it turns out, these types of models have been around for quite a while for various image recognition tasks. The CIFAR-10 dataset in particular was solved by a neural network very similar to the one from that post, and the general idea of using convolutional neural networks dates back to Yann LeCun’s 1998 paper on digit recognition.

Since I have a lot of experience working with data but not a lot of experience working with deep learning algorithms, I wondered how easy it would be to adapt these methods to a new, somewhat related, problem: going through thousands of my photos to identify pictures of my cat. Turns out, it was easier than I thought.

[Image: 2012-12-30 09.58.28.jpg. Caption: This is definitely a cat photo]

Training a similar neural network on my own visual data amounted to little more than wiring up the inputs and outputs properly. In particular, every input image has to be rescaled (down or up) to 32×32 pixels, and the output must be binary, representing membership in one of the two classes.

The main difficulty involves creating your dataset. This really just means going through your images and classifying a subset of them by hand. For my own run at this, all I did was create a directory like:

images/
    cat/
    not_cat/

I put any cat photos I found into the cat directory while putting any non-cat photographs in the other folder. I tried to keep the same number of images in both directories to try to avoid any class imbalance problems. Then again, this wasn’t as much of a concern since roughly half my photos are cat photos anyway.

From there, tflearn has a helper function that builds an HDF5 dataset straight from a directory of images like the one above. The X and Y values from that data structure can then be used as the inputs to the deep learning model.

By using around 400 images (roughly 200 per class), my classifier achieved about 85% accuracy on a validation set. For my purposes, namely automatically tagging potential photos of my cat, that was accurate enough. Improving on it would probably involve some combination of:

  • adding more training data by putting images into my class folders
  • changing the shape of the network by adding more layers or more nodes per layer
  • using a pre-trained model to bootstrap the learning process

That’s all it really takes. If you know a bit of Python and can sort a few of your photos into folders based on their categories, you can get started using sophisticated deep learning algorithms on your own images.

You can find the code for this on my GitHub account. If you want to chat or reach out at all, follow me on Twitter @mathcass.


Getting up and running with Python virtual environments


Python is a great tool to have available for all sorts of tasks, including data analysis and machine learning. It’s a great language to start off with if you’re a beginner, and there are loads of tutorials out there. So, if you’re a neophyte Pythonista, head over to one of those and come back here later.

Additionally, plenty of great developers have been working on tools that just get the job done, including pandas for wrangling your data (and turning it into something that looks like a spreadsheet), as well as Scikit-Learn for running anything from basic statistics to more complex learning algorithms on your data.

I’ve used Python long enough to have made most of the mistakes there are to make with it, and the best piece of advice I have for anyone getting started is to use a virtual environment. You see, Python has built-in tools that let you download and use other people’s code so you can leverage their work in your own analyses. Most of the time, this happens without a problem. But sometimes, say when a developer updates a package in a way that breaks how you’re using it, you’ll want to stick with the old version until you can try out the upgrade. Virtual environments provide a sandbox that keeps different versions of Python modules separate so they can’t conflict with one another.
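For the impatient, here is a minimal sketch using the venv module that ships with Python 3.3 and later; the environment name myproject-env is just an example:

```shell
python3 -m venv myproject-env              # create the sandbox
. myproject-env/bin/activate               # activate it (puts it first on PATH)
python -c 'import sys; print(sys.prefix)'  # now points inside myproject-env
# pip install pandas scikit-learn          # installs land in the sandbox, not system-wide
pip freeze > requirements.txt              # record exact versions for reproducibility
deactivate                                 # leave the sandbox
```

On Windows the activation script lives at myproject-env\Scripts\activate instead.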

In fact, I’d actually suggest you do this in almost all contexts.

  • Are you starting a new project and have no idea where it’s heading? Use a virtual environment.
  • Are you setting up a production server so you can deploy and run your Python code? Use a virtual environment.
  • Are you writing a research paper that analyzes some data you’ll eventually publish? Use a virtual environment, and then share how it’s set up so other people can reproduce your results.

So how do you go about using a virtual environment? If you’re using Mac OS or a Linux distribution, one of my favorite tools is pyenv, which works quite seamlessly once you’ve installed some dependencies (the tools that actually build Python). Here’s the original guide I started off with, and I still use it as a reference if I run into any issues. The thing I really love about pyenv is that it lets you install and manage different Python versions as well.
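A typical pyenv session looks something like the sketch below, assuming pyenv is already installed; the version number and project path are just examples:

```shell
pyenv install 3.5.0        # download and build that Python version
pyenv versions             # list every version pyenv manages
cd ~/projects/my-analysis
pyenv local 3.5.0          # pin this directory to that version
python --version           # the shell now picks up Python 3.5.0
```

The `pyenv local` command writes a .python-version file in the directory, so the pin travels with the project.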

On Windows, the experience is a bit different, but I think this guide is great for getting started. That article focuses on setting up virtual environments, but the earlier articles in the series should help with the installation process. Looking around, it seems that Python 3.3 shipped with a launcher that lets Windows users switch between different Python versions, which is very nice to have. I haven’t checked it out yet, but I look forward to it.

In essence, if you haven’t tried out virtual environments yet, get started as soon as you can. It’s worth the time invested in setting one up and understanding a few things under the hood (like how the command-line PATH variable and PYTHONPATH work). All in all, I’ve never regretted setting one up for even the simplest of tasks, and I’ve almost always cursed myself when I didn’t use one.