Coloring B&W Images with Machine Learning

I’m finishing up my Machine Learning Nanodegree from Udacity, and I decided to train models to color black and white images for my final project. While this may sound daunting to some, this particular task has been done by many folks already. I’m following a tutorial posted by Emil Wallner, with data that WW2DB graciously provided. The tutorial is in turn based on another existing project called Deep Koalarization. Google is already beta testing their own colorization functionality for the masses, so chances are people won’t be banging on my door to use my models. However, I’m thrilled that I got a chance to try this out for myself.

Here’s code from Emil’s tutorial:
https://github.com/emilwallner/Coloring-greyscale-images/blob/master/Full-version/full_version.ipynb

Here’s a gist of what I tried:
https://gist.github.com/shabububu/95fbbed0e6ef4024f1c6d123bd25328f

While it’s not perfect, I’m pretty happy with the results right out of the box.

Original Iwo Jima photo from WW2DB, modified to be 256×256
Resulting image colored by the model I trained

I didn’t change much from the original tutorial. It still uses the soon-to-be-deprecated TensorFlow 1.x, but I added a validation set for tracking mean squared error, along with a few other small changes. The training data consists of 1,600 color images related to WWII, at least 75% of them from before 1950. All of the images I received from WW2DB had been resized to 256×256 beforehand using ImageMagick. Of those 1,600, about 5% were used for validation. Finally, 122 color images were set aside for testing at the end. My script ran through the data about 250 times, and training took under 8 hours.
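For anyone curious what that split looks like in practice, here’s a minimal sketch. The filenames and exact ratios are illustrative, not taken from my actual script — the point is just shuffling once and carving off roughly 5% for validation:

```python
import random

# Hypothetical list of the 1,600 resized (256x256) training image filenames.
filenames = [f"img_{i:04d}.png" for i in range(1600)]

random.seed(42)           # fixed seed so the split is reproducible
random.shuffle(filenames)

val_count = int(len(filenames) * 0.05)   # roughly 5% held out for validation
val_files = filenames[:val_count]
train_files = filenames[val_count:]

print(len(train_files), len(val_files))  # 1520 80
```

The test images never enter this list at all — they stay in a separate directory so the model can’t see them until the very end.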

Here’s an example from my test set — the original colors, the colors stripped, and recolored with my model:

Original Colors
Colors Removed
Colored from Black and White with ML

Here’s one where there is no ground truth:

Black and White WWII Iwo Jima Photo (scaled to 256×256)
Colored with ML

The mind-blowing thing about this project is how easy it was for me to start training machine learning models for myself. A simple Google search turned up previous work from so many other people. I took a Udacity course for more structure in my pursuit of knowledge, but I could have easily gone through this tutorial without it and still produced good results. I didn’t pay for compute time because I used Google Colab, which gave me access to a free Jupyter notebook environment hooked up to a GPU. I ended up being lucky enough to get a machine with 25+ GB of RAM and 300+ GB of storage. I did pay for extra Google Drive space ($20 per year), but that’s about it. It’s honestly amazing what we can do these days.

Python ucs2 vs ucs4 Installation

To force a ucs4 build when compiling Python:

./configure --enable-unicode=ucs4
make
make install

To force a ucs2 build when compiling Python:

./configure --enable-unicode=ucs2
make
make install

Once you have it installed, a ucs4 version will produce the following:

bash$ python
Python 2.5.2 (r252:60911, Mar  4 2008, 10:58:18)
[GCC 3.4.5] on linux
>>> import sys
>>> print sys.maxunicode
1114111

And, a ucs2 version will produce the following:

>>> import sys
>>> print sys.maxunicode
65535
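If you need to branch on the build type from inside a script rather than an interactive session, a small check along these lines works (note that on Python 3.3 and later the narrow/wide distinction was removed by PEP 393, so sys.maxunicode always reports 1114111 there):

```python
import sys

# sys.maxunicode reveals the build: 0x10FFFF (1114111) for a wide/ucs4
# build, 0xFFFF (65535) for a narrow/ucs2 build.
if sys.maxunicode == 0x10FFFF:
    print("wide (ucs4) build")
else:
    print("narrow (ucs2) build")
```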