Many compressed sensing (CS) recovery algorithms have been proposed. Here I list some of them and provide the corresponding experimental results. Essentially, these recovery algorithms are similar to sparse coding over an over-complete dictionary. The corresponding MATLAB code for the following algorithms can be downloaded from my homepage, http://home.ustc.edu.cn/~roy.
1. Orthogonal Matching Pursuit (OMP) [1]
![Recovery result by using OMP](https://i-blog.csdnimg.cn/blog_migrate/a28caec4c40133938991c6066c8a5ea8.jpeg)
Fig. 1. Recovery result by using OMP
I employ OMP to recover an image in a column-by-column fashion, i.e., each column is processed separately. The recovery result is shown in Fig. 1. The advantages of OMP are its easy implementation and fast speed.
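The MATLAB code is on my homepage above; purely as an illustration of the greedy selection loop (my own NumPy sketch, not the downloadable implementation), OMP for a single measurement vector might look like:

```python
import numpy as np

def omp(Phi, y, K, tol=1e-6):
    """OMP sketch: greedily pick the atom most correlated with the residual,
    then refit all selected atoms by least squares."""
    m, n = Phi.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(K):
        # Atom most correlated with the current residual.
        idx = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(idx)
        # Least-squares fit over the selected support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - Phi @ x
        if np.linalg.norm(residual) < tol:
            break
    return x
```

For an image, one would call this once per column, with `y` the measurements of that column.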
2. Compressive Sampling Matching Pursuit (CoSaMP) [2]
The foundation of this algorithm is the observation that “For an s-sparse signal x, the vector y = Φ*Φx can serve as a proxy for the signal because the energy in each set of s components of y approximates the energy in the corresponding s components of x. In particular, the largest s entries of the proxy y point toward the largest s entries of the signal x.”
![Recovery result by using CoSaMP](https://i-blog.csdnimg.cn/blog_migrate/5d3576b08ac12e2b4300d31a2081471f.jpeg)
Fig. 2. Recovery result by using CoSaMP
Fig. 2 illustrates the recovery result of CoSaMP. As with OMP, the image is recovered in a column-by-column fashion to reduce complexity. In my experiments, OMP performs better than CoSaMP here, while CoSaMP is computationally much more expensive than OMP.
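To make the proxy idea concrete, here is my own NumPy sketch of the CoSaMP iteration (again not the homepage MATLAB code): form the proxy, take its 2K largest entries, merge with the current support, refit, and prune back to K.

```python
import numpy as np

def cosamp(Phi, y, K, max_iter=30, tol=1e-6):
    """CoSaMP sketch: 2K candidates from the proxy, least squares on the
    merged support, then prune to the K largest coefficients."""
    m, n = Phi.shape
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(max_iter):
        proxy = Phi.T @ residual                     # signal proxy Phi^T r
        omega = np.argsort(np.abs(proxy))[-2 * K:]   # 2K largest proxy entries
        T = np.union1d(omega, np.flatnonzero(x))     # merge with current support
        coef, *_ = np.linalg.lstsq(Phi[:, T], y, rcond=None)
        b = np.zeros(n)
        b[T] = coef
        keep = np.argsort(np.abs(b))[-K:]            # prune to K entries
        x = np.zeros(n)
        x[keep] = b[keep]
        residual = y - Phi @ x
        if np.linalg.norm(residual) < tol:
            break
    return x
```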
3. Greedy Basis Pursuit (GBP) [3]
The authors state that GBP can discard inappropriate atoms from the already-selected set of atoms. Note that this property is not reflected in the following description.
![Recovery result by using GBP](https://i-blog.csdnimg.cn/blog_migrate/0e2ed039ea8b910017446220d9404366.jpeg)
Fig. 3. Recovery result by using GBP
According to my experimental results, the performance of GBP, shown in Fig. 3, is better than that of OMP and CoSaMP. However, GBP is also much more computationally complex than OMP and CoSaMP.
4. Iteratively Reweighted Least Square (IRLS) [4]
![Recovery result by using IRLS](https://i-blog.csdnimg.cn/blog_migrate/3f92d1726a1287ae63d40a0e034c62d8.jpeg)
Fig. 4. Recovery result by using IRLS
Fig. 4 shows the recovery result of this algorithm. As can be seen, there is some improvement over the previous algorithms in terms of PSNR.
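The idea behind IRLS is to replace the non-smooth lp objective by a sequence of weighted least-squares problems. A minimal sketch of my understanding (the choice p = 1 and the simple eps-annealing schedule are my assumptions, not necessarily what Chartrand and Yin use):

```python
import numpy as np

def irls(Phi, y, p=1.0, max_iter=50, eps=1.0):
    """IRLS sketch for min ||x||_p s.t. Phi x = y: each iteration solves a
    weighted least-norm problem; eps smooths the weights and is annealed."""
    x = np.linalg.pinv(Phi) @ y            # start from the min-l2 solution
    for _ in range(max_iter):
        q = (x**2 + eps) ** (1 - p / 2)    # q_i ~ |x_i|^(2-p)
        # Weighted least-norm update: x = Q Phi^T (Phi Q Phi^T)^{-1} y, Q = diag(q)
        x = q * (Phi.T @ np.linalg.solve((Phi * q) @ Phi.T, y))
        eps = max(eps / 10.0, 1e-12)       # anneal the smoothing parameter
    return x
```

Each update is the closed-form minimizer of the weighted quadratic subject to the equality constraint; as eps shrinks, the weights concentrate on the large entries and small entries are driven toward zero.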
5. Subspace Pursuit (SP) [5]
As stated by the authors, the main difference between SP and CoSaMP is the manner of adding new candidates. More precisely, SP adds only K new candidates in each iteration, while CoSaMP adds 2K, which makes SP computationally more efficient but its underlying analysis more complex.
![Recovery result by using SP](https://i-blog.csdnimg.cn/blog_migrate/39de0ac2b6fd37407f141b3846733ecd.jpeg)
Fig. 5. Recovery result by using SP
Fig. 5 illustrates the recovery result of SP. As can be seen, its performance is better than that of CoSaMP in terms of PSNR.
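For comparison with the CoSaMP description above, here is my own NumPy sketch of SP (the stopping rule on an unchanged support is my simplification of the paper's residual-based test):

```python
import numpy as np

def subspace_pursuit(Phi, y, K, max_iter=30, tol=1e-6):
    """SP sketch: add K candidates per iteration (vs. 2K in CoSaMP),
    refit by least squares, and prune back to a size-K support."""
    m, n = Phi.shape
    support = np.sort(np.argsort(np.abs(Phi.T @ y))[-K:])   # initial support
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef
    for _ in range(max_iter):
        extra = np.argsort(np.abs(Phi.T @ residual))[-K:]   # K new candidates
        T = np.union1d(support, extra)
        coef_T, *_ = np.linalg.lstsq(Phi[:, T], y, rcond=None)
        # Keep the K indices with the largest least-squares coefficients.
        new_support = np.sort(T[np.argsort(np.abs(coef_T))[-K:]])
        coef, *_ = np.linalg.lstsq(Phi[:, new_support], y, rcond=None)
        residual = y - Phi[:, new_support] @ coef
        stop = np.array_equal(new_support, support) or np.linalg.norm(residual) < tol
        support = new_support
        if stop:
            break
    x = np.zeros(n)
    x[support] = coef
    return x
```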
6. Regularized Orthogonal Matching Pursuit (ROMP) [6]
This algorithm is similar to OMP and SP; the main difference is that ROMP has a regularization stage. This stage is described by the authors as “Among all subsets J0 ⊆ J with comparable coordinates — |u(i)| ≤ 2|u(j)| for all i, j ∈ J0 — choose J0 with the maximal energy ||u|J0||2.” Furthermore, the authors explain the regularization more specifically: “The regularization step of ROMP, i.e. selecting J0, can be done fast by observing that J0 is an interval in the decreasing rearrangement of coefficients. Moreover, the analysis of the algorithm shows that instead of searching over all intervals, it suffices to look for J0 among consecutive intervals with endpoints where the magnitude of coefficients decreases by a factor of 2.”
I can’t understand the above description well, and couldn’t implement it properly.
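One possible reading of the regularization step (my own interpretation, which may not match the authors' intent): sort the candidate coordinates of the proxy u in decreasing magnitude, scan consecutive intervals in which all magnitudes are within a factor of 2 of each other, and keep the interval of maximal l2 energy.

```python
import numpy as np

def romp_regularize(u, J):
    """Sketch of ROMP's regularization: among subsets of J whose |u| values
    are within a factor of 2 (scanned as consecutive intervals in the
    decreasing rearrangement), return the subset of maximal l2 energy."""
    J = np.asarray(J)
    order = J[np.argsort(-np.abs(u[J]))]   # J sorted by decreasing |u|
    mags = np.abs(u[order])
    best, best_energy = order[:1], float(mags[0] ** 2)
    i = 0
    while i < len(order):
        j = i
        # Extend while the interval's smallest entry stays within a factor
        # of 2 of its largest entry (mags is decreasing).
        while j + 1 < len(order) and 2 * mags[j + 1] >= mags[i]:
            j += 1
        energy = float(np.sum(mags[i:j + 1] ** 2))
        if energy > best_energy:
            best, best_energy = order[i:j + 1], energy
        i = j + 1
    return best
```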
7. Iterative Hard Thresholding (IHT) [7]
I implement this scheme according to the IHT iteration x^{n+1} = H_K(x^n + Φ^T(y − Φx^n)), where H_K keeps the K largest-magnitude entries and sets the rest to zero. However, the recovery result is terrible, as depicted in Fig. 6. The reason may be that the operator norm of Φ is not smaller than one, i.e. the theoretical requirement of IHT is not met.
![Recovery result by using IHT](https://i-blog.csdnimg.cn/blog_migrate/8cde33f8698249e1b8e4919c16d73c15.jpeg)
Fig. 6. Recovery result by using IHT
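One common workaround for the operator-norm requirement is to rescale the gradient step by ||Φ||₂⁻²; the following is my own sketch with that fix (the step factor 0.9 is an assumption), not the implementation used for Fig. 6:

```python
import numpy as np

def iht(Phi, y, K, max_iter=500, mu=None):
    """IHT sketch: x <- H_K(x + mu * Phi^T (y - Phi x)), where H_K keeps the
    K largest-magnitude entries. The step mu is scaled by ||Phi||_2^{-2} so
    the iteration behaves as if the operator norm were below one."""
    m, n = Phi.shape
    if mu is None:
        mu = 0.9 / np.linalg.norm(Phi, 2) ** 2   # spectral-norm rescaling
    x = np.zeros(n)
    for _ in range(max_iter):
        x = x + mu * (Phi.T @ (y - Phi @ x))     # gradient step
        small = np.argsort(np.abs(x))[:-K]       # all but the K largest
        x[small] = 0.0                           # hard threshold
    return x
```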
8. Linear Programming (LP) [8]
Fig. 7 shows the recovery result obtained by solving the standard BP problem with a primal-dual algorithm.
![Recovery result by using BP](https://i-blog.csdnimg.cn/blog_migrate/ede658145d136fdcce55eaeeabeffd28.jpeg)
Fig. 7. Recovery result by using BP
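The equality-constrained BP problem min ||x||₁ s.t. Φx = y is a linear program: split x = u − v with u, v ≥ 0 and minimize the sum of all components. As a sketch (using SciPy's generic LP solver rather than the primal-dual code of l1-magic):

```python
import numpy as np
from scipy.optimize import linprog

def bp_linprog(Phi, y):
    """Basis Pursuit as an LP: min ||x||_1 s.t. Phi x = y, with the split
    x = u - v, u >= 0, v >= 0, so the objective is sum(u) + sum(v)."""
    m, n = Phi.shape
    c = np.ones(2 * n)                  # objective: ||x||_1
    A_eq = np.hstack([Phi, -Phi])       # Phi u - Phi v = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```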
Besides solving the l1 minimization with an equality constraint, the signal can also be recovered by solving the l1 minimization with a quadratic constraint.

Fig. 8 illustrates the recovery result of the l1 minimization with a quadratic constraint. As can be seen, its performance is better than that of the equality-constrained version above.
![Recovery result by solving l1 minimization with quadratic constraint](https://i-blog.csdnimg.cn/blog_migrate/a62057f281b8f0049cfa927764459b20.jpeg)
Fig. 8. Recovery result by solving l1 minimization with quadratic constraint
All the above descriptions and experiments are based on my own understanding; if anything is inappropriate, please contact me at <roy@mail.ustc.edu.cn>.
References
[1] J. Tropp and A. Gilbert, “Signal Recovery from Random Measurements via Orthogonal Matching Pursuit,” 2007.
[2] D. Needell and J. Tropp, “CoSaMP: Iterative Signal Recovery from Incomplete and Inaccurate Samples,” 2008.
[3] P. Huggins and S. Zucker, “Greedy Basis Pursuit,” 2006.
[4] R. Chartrand and W. Yin, “Iteratively Reweighted Algorithms for Compressed Sensing,” 2008.
[5] W. Dai and O. Milenkovic, “Subspace Pursuit for Compressive Sensing Signal Reconstruction,” 2009.
[6] D. Needell and R. Vershynin, “Uniform Uncertainty Principle and Signal Recovery via Regularized Orthogonal Matching Pursuit,” 2007.
[7] T. Blumensath and M. Davies, “Iterative Hard Thresholding for Compressed Sensing,” 2008.
[8] E. Candes and J. Romberg, “l1-Magic: Recovery of Sparse Signals via Convex Programming,” 2005.