Hipster or Homeless — Can AI tell the difference?
Google has just released version 2.0 of Teachable Machine, an awesome tool that lets anybody experiment with machine learning in the browser.
Beyond the default examples with webcam images and gestures, I was looking for something new. Something bigger! Answering a question that is sometimes pretty hard to answer…
Is this guy homeless or just a hipster?
I know there is no need for such an AI model in the real world… but who cares? It’s a fun way to try Teachable Machine for a couple of minutes! ;-)
Collect training data
To fetch some example images for training, we can use a browser extension called Fatkun Batch Download Image. It makes it pretty easy to download all the images returned by a Google image search. Just enter your query, e.g. “hipster man beard”, and hit the download button. The extension creates a folder, labeled with your query, inside your Downloads folder.
Repeat this step for a few queries and sort all the downloaded images into two separate folders called “hipster” and “homeless”.
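If you end up with many query-labeled download folders, sorting them by hand gets tedious. Here is a small sketch that merges them into the two class folders; the query folder names and the mapping are just examples based on the searches above, so adjust them to your own queries.

```python
# Sketch: merge query-labeled download folders into two class folders.
# The folder names below are examples; adapt the mapping to your own queries.
from pathlib import Path
import shutil

def merge_into_classes(download_dir: str, mapping: dict) -> None:
    """Move every image from each query folder into its class folder."""
    root = Path(download_dir)
    for query_folder, class_name in mapping.items():
        class_dir = root / class_name
        class_dir.mkdir(exist_ok=True)
        src = root / query_folder
        if not src.is_dir():
            continue
        for image in src.iterdir():
            if image.suffix.lower() in {".jpg", ".jpeg", ".png"}:
                # Prefix with the query folder name to avoid filename clashes.
                shutil.move(str(image), str(class_dir / f"{src.name}_{image.name}"))

# Example mapping from download folders (named after the queries) to classes:
QUERY_TO_CLASS = {
    "hipster man beard": "hipster",
    "homeless guy": "homeless",
}
```

Running `merge_into_classes("~/Downloads", QUERY_TO_CLASS)` (with the path expanded) would leave you with exactly the two folders the next step needs.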
Please keep in mind that we, the people who train neural networks, are responsible for keeping ethical bias in mind! When you search for “homeless guy”, far more of the results show people of color. This is a sad truth… and it could lead to unintended results, where our model decides the class based on the color of a person’s skin.
You are the one responsible for augmenting the training data and preventing ethical bias!
Think about these cases and try to keep the data balanced by also searching for “homeless white male”, “hipster black female”, etc.
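Before training, it is worth a quick sanity check that the two classes are at least roughly balanced. A minimal sketch, assuming the two class folders from the previous step exist on disk:

```python
# Sketch: count the images per class folder and compute a simple balance ratio.
from pathlib import Path

IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png"}

def count_images(folder: str) -> int:
    """Count the image files directly inside a class folder."""
    path = Path(folder)
    if not path.is_dir():
        return 0
    return sum(
        1 for f in path.iterdir()
        if f.is_file() and f.suffix.lower() in IMAGE_EXTENSIONS
    )

def balance_ratio(count_a: int, count_b: int) -> float:
    """Ratio of the smaller class to the larger one (1.0 = perfectly balanced)."""
    if max(count_a, count_b) == 0:
        return 0.0
    return min(count_a, count_b) / max(count_a, count_b)
```

With the class sizes from my run below (430 hipsters, 276 homeless), `balance_ratio(430, 276)` comes out around 0.64, which is skewed but workable.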
Training the model
To train our model, we just drag and drop the files from each folder onto the corresponding class in the UI. In my case, there were 430 images of hipsters and 276 images of homeless people. Google handles the image cropping by itself.