ICA on Images with Python

June 23, 2018

(This article was first published on Open Source Automation, and kindly contributed to R-bloggers)



What is Independent Component Analysis (ICA)?

If you’re already familiar with ICA, feel free to skip below to how we implement it in Python.

ICA is a type of dimensionality reduction algorithm that transforms a set of variables into a new set of components; it does so such that the statistical independence between the new components is maximized. This is similar to Principal Component Analysis (PCA), which maps a collection of variables to statistically uncorrelated components, except that ICA goes a step further by maximizing statistical independence rather than just producing uncorrelated components.

Like other dimensionality reduction methods, ICA seeks to reduce the number of variables in a set of data while retaining key information. In the example we lay out in this post, the variables represent pixels in an image. One motivation for using ICA on images is image compression: rather than storing thousands or even millions of pixels in an image, storing the independent components takes up much less memory. Also, by its nature, ICA extracts the independent components of images, which means it will find the curves and edges within an image. For example, in facial recognition, ICA will identify the eyes, nose, and mouth as independent components.
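To make the compression intuition concrete, here is a back-of-the-envelope count. The image size (300 by 400 pixels) and component count are hypothetical, not taken from the article's actual image; rows are treated as samples, matching the code later in the post:

```python
# hypothetical image size and component count (assumptions,
# not taken from the article's actual image)
rows, cols, k = 300, 400, 10

original_values = rows * cols  # one intensity value per pixel
transformed = rows * k         # each row expressed in k components
mixing = k * cols              # mixing matrix mapping components back to pixels
mean = cols                    # per-column mean used in reconstruction
compressed_values = transformed + mixing + mean

print(original_values, compressed_values)  # 120000 7400
```

Under these assumptions, storing the components and the mixing matrix needs roughly 6% of the values the raw pixels would.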

ICA can be implemented in several open source languages, including Python, R, and Scala. This post will show you how to do ICA in Python with scikit-learn.

For more information on the mathematics behind ICA and how it functions as an algorithm, see here. Also, for a contrast between ICA and PCA, check out this Udacity video.

ICA with Python

First, let’s load the packages we’ll need. The main functionality we want is the FastICA class available from sklearn.decomposition. We’ll also load the skimage package, which we’ll use to read in a sample image, and pylab, which will show the image on the screen (you may need this if you’re using an IPython Notebook).

# load packages
from sklearn.decomposition import FastICA
from pylab import *
from skimage import data, io, color

Next, we read in the image. We will set the parameter as_grey equal to True (in newer versions of skimage, this parameter is spelled as_gray). This will make every pixel in the image a single grayscale intensity value, rather than a 3-dimensional RGB value. For more information, see this link.

emc2_image = io.imread("emc2.png", as_grey = True)
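Since emc2.png isn't included here, a small numpy sketch (with a synthetic array standing in for the image) illustrates what the grayscale conversion does: it collapses the three color channels into one weighted-average intensity per pixel. The channel weights below approximate the ones skimage uses:

```python
import numpy as np

# synthetic stand-in for an RGB image, with values in [0, 1]
rng = np.random.RandomState(0)
rgb = rng.rand(4, 5, 3)

# weighted average over the color channels, similar to what
# as_grey=True does internally (the weights sum to 1)
weights = np.array([0.2125, 0.7154, 0.0721])
gray = rgb @ weights

print(gray.shape)  # (4, 5): one intensity value per pixel
```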

Now, we choose a number of components we want, and use that number to create a FastICA object. In the sample below, we’ll create a FastICA object with 10 components. This will allow us to run ICA on our image, resulting in 10 independent components.

ica = FastICA(n_components = 10)

Then, we use our object, ica, to run the ICA algorithm on the image. An important check when doing any type of dimensionality reduction is to see how much information has been lost. In our example, we will reconstruct the image from the independent components; that is, how does the image look if we only keep the 10 independent components we've developed?

# run ICA on the image and get the independent components
emc2_image_ica = ica.fit_transform(emc2_image)

# reconstruct the image from the independent components
emc2_restored = ica.inverse_transform(emc2_image_ica)
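The information loss can also be quantified, for example with mean squared error between the original and restored images. A self-contained sketch, where a random array stands in for the grayscale image (emc2.png isn't available here) and rows are treated as samples, as above:

```python
import numpy as np
from sklearn.decomposition import FastICA

# synthetic stand-in for a grayscale image (rows = samples)
rng = np.random.RandomState(0)
img = rng.rand(64, 128)

ica = FastICA(n_components=10, random_state=0, max_iter=1000)
components = ica.fit_transform(img)           # shape: (64, 10)
restored = ica.inverse_transform(components)  # shape: (64, 128)

# mean squared reconstruction error: lower means less information lost
mse = float(np.mean((img - restored) ** 2))
print(components.shape, restored.shape, mse)
```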

# show the reconstructed image to the screen
io.imshow(emc2_restored)
show()

As you can see, using just 10 independent components still shows a very recognizable version of our original picture. What happens if we change the number of components?

One Component

Three Components

Five Components

Ten Components

Twenty Components

With five independent components, our image is fairly recognizable. With twenty components, it looks very similar to the original version.
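The visual trend above can be checked numerically: reconstruction error should shrink as the number of components grows. A sketch with a synthetic image standing in for the real one (emc2.png isn't available here):

```python
import numpy as np
from sklearn.decomposition import FastICA

# synthetic stand-in for the grayscale image
rng = np.random.RandomState(0)
img = rng.rand(64, 128)

errors = {}
for k in [1, 3, 5, 10, 20]:
    ica = FastICA(n_components=k, random_state=0, max_iter=1000)
    restored = ica.inverse_transform(ica.fit_transform(img))
    errors[k] = float(np.mean((img - restored) ** 2))

# more components retain more information, so the error falls
print(errors)
```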

ICA has many other applications, including analyzing stock market prices, facial recognition, and more.

