Matrix Cumulative Coherence: Fourier Bases, Random and Sensing Matrices


Compressive sampling (CS) is revolutionizing the way we approach analog-to-digital conversion, our understanding of linear systems, and the limits of information theory. One of the key concepts in CS is that a signal either is already sparse or can be represented in a sparse basis. The value of a sparse basis is that when the signal is randomly sampled it can, in a nutshell, be recovered (or a solution to the linear problem can be found) from far fewer samples. A landmark example is sparse MRI.

The concept of choosing a “more” sparse basis or CS sensing matrix rests on the measure of mutual coherence. For a given matrix, it is defined at order k as
$$ M(A, k) = \max_{p} \; \max_{\Omega :\, |\Omega| = k,\; p \notin \Omega} \; \sum_{q \in \Omega} \frac{ | \langle a_{p}, a_{q} \rangle | }{ \lVert a_{p} \rVert \, \lVert a_{q} \rVert } $$
When k = 1 it is easy to understand what this means: we simply find the largest normalized inner product among distinct columns of the given matrix. The lower the value, the better the incoherence, i.e. the friendlier the matrix is to sparse recovery. However, a single number may not be very informative; after all, how low is low enough? Following David Donoho and Joel Tropp, if M grows slowly with k then the matrix is said to enhance sparsity. Larger values of k form a set of columns $\Omega$, and the second column is selected from this set, i.e. the second max argument in the definition above.
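To make the k = 1 case concrete, here is a minimal R sketch that computes it directly from the definition; the function name coherence1 is purely illustrative and not part of R1magic.

coherence1 <- function(A) {
  An <- apply(A, 2, function(x) x / sqrt(sum(Mod(x)^2)))  # unit-norm columns
  G  <- Mod(Conj(t(An)) %*% An)                           # |<a_p, a_q>| for all p, q
  diag(G) <- 0                                            # exclude p = q
  max(G)                                                  # worst-case pairwise coherence
}
coherence1(matrix(rnorm(100), 10, 10))  # e.g. a 10 x 10 Gaussian matrix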

In a recent post I briefly reviewed my R package for CS, called R1magic. Its recent version 0.2 adds functionality to compute $M(A, k)$. There is now also a public GitHub repository for the package (github R1magic). The mutualCoherence function is written in a fully functional style: all operations for computing $M(A, k)$ are performed in a vectorized fashion in R, using function closures and apply. However, for much larger matrices, a lower-level implementation may be required.
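For readers who want to cross-check the package output, a naive, unoptimized reference implementation can be written directly from the definition above. The sketch below is not the R1magic code (naiveCoherence is a hypothetical name) and may differ from mutualCoherence in normalization conventions; it returns the values M(A, 1), ..., M(A, k).

naiveCoherence <- function(A, k) {
  An <- apply(A, 2, function(x) x / sqrt(sum(Mod(x)^2)))  # normalize columns
  G  <- Mod(Conj(t(An)) %*% An)                           # absolute Gram matrix
  diag(G) <- 0                                            # exclude q = p
  # for each order 1..k, take the worst-case sum of the largest off-diagonal entries per column
  sapply(1:k, function(kk)
    max(apply(G, 1, function(row) sum(sort(row, decreasing = TRUE)[1:kk]))))
}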

Example

Here we briefly investigate the coherence of Fourier, random, and mixed bases in R.

require("R1magic")
set.seed(42)
A <- DFTMatrix0(10) # Fourier Bases
B <- matrix(rnorm(100), 10, 10) # Gaussian Random Matrix
C <- A %*% B # A sensing matrix with A and B as above
aa <- mutualCoherence(A, 8)
bb <- mutualCoherence(B, 8)
cc <- mutualCoherence(C, 8)
aa
[1] 1 1 1 1 1 1 1 1
bb
[1] 0.6784574 1.2011489 1.7001046 2.1713561 2.4664608 2.7302690 2.7908302
[8] 2.9623327
cc
[1] 0.7506222 1.3448452 1.8047043 2.1105348 2.3350516 2.4703822 2.5898766
[8] 2.6882250
We observe that the mixed basis, the so-called CS matrix $C$ in the notation above, built from the Fourier basis ($A$) and the random basis ($B$), has a mutual coherence that increases more slowly than that of the pure Gaussian random matrix. This may signify a more incoherent basis. However, since this is synthetic data, it does not tell us anything about universal behaviour. A real data or signal measurement matrix should be tested for a better judgement of which basis to use in sparse recovery. A recent paper in optical tomography uses this concept to identify a better sensing matrix for tomographic image reconstruction.
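To compare the growth rates visually, one could plot the three curves; a minimal sketch, assuming aa, bb and cc from the chunk above are still in the workspace:

k <- 1:8
plot(k, bb, type = "b", ylim = range(c(aa, bb, cc)),
     xlab = "order k", ylab = "M(., k)", main = "Cumulative coherence growth")
lines(k, cc, type = "b", lty = 2)   # mixed basis C
lines(k, aa, type = "b", lty = 3)   # Fourier basis A
legend("bottomright", legend = c("Gaussian (B)", "Mixed (C)", "Fourier (A)"), lty = 1:3)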


