Notes written after studying http://blog.youkuaiyun.com/tiandijun/article/details/41578175
1. Sparse representation
Problem addressed: as the amount of data grows, solving for the basis of a linear representation becomes very expensive, and many of the terms are redundant; sparse representation addresses this.
The intuitive view of sparsity: keep the reconstruction error small while using as few non-zero coefficients as possible. Counting non-zero entries is the l0-norm problem, but that constraint is too strong and the problem is non-convex; relaxing it gives the l1-norm, which is a convex optimization problem, and relaxing further leads to the general lp-norm formulation.
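Written out (notation is my own, not from the post: y the input signal, D the dictionary, α the coefficient vector, ε the error tolerance), the chain of relaxations is roughly:

```latex
\min_{\alpha} \|\alpha\|_0 \ \text{s.t.}\ \|y - D\alpha\|_2 \le \varepsilon
\;\longrightarrow\;
\min_{\alpha} \|\alpha\|_1 \ \text{s.t.}\ \|y - D\alpha\|_2 \le \varepsilon
\;\longrightarrow\;
\min_{\alpha} \|\alpha\|_p^p \ \text{s.t.}\ \|y - D\alpha\|_2 \le \varepsilon
```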
If we select atoms from a very large dictionary, it is quite likely that one of them is extremely similar to the input, so a sparse solution such as [0, ..., 1, ..., 0] very likely exists;
therefore, with an over-complete dictionary, sparse solutions can work.
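As a minimal sketch of l1 sparse coding over an (over-complete) dictionary, here is plain ISTA in NumPy; the function names and parameters are my own, not taken from the post or the papers cited below:

```python
import numpy as np

def soft_threshold(x, t):
    """Element-wise soft-thresholding: the proximal operator of t * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(y, D, lam, n_iter=200):
    """Approximately minimize 0.5 * ||y - D a||_2^2 + lam * ||a||_1.
    D may have many more columns than rows (over-complete dictionary)."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the data-term gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = soft_threshold(a + D.T @ (y - D @ a) / L, lam / L)
    return a
```

If the dictionary contains an atom almost identical to y, the recovered code tends to concentrate on that atom, which is the [0, ..., 1, ..., 0] case mentioned above.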
Limitations of sparse representation
Improvements
Nonlocally Centralized Sparse Representation
[1] W. Dong, L. Zhang and G. Shi, “Centralized Sparse Representation for Image Restoration”, in ICCV 2011.
[2] W. Dong, L. Zhang, G. Shi and X. Li, “Nonlocally Centralized Sparse Representation for Image Restoration”, IEEE Trans. on Image Processing, vol. 22, no. 4, pp. 1620-1630, April 2013.
NCSR: idea
NCSR: objective function
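Paraphrasing the objective from [2] above (up to notation, which may differ slightly from the paper): y is the degraded image, H the degradation operator, Φ the dictionary, α the concatenation of patch codes, and β_i a nonlocal estimate of α_i obtained as a weighted average of the codes of similar patches:

```latex
\hat{\alpha} = \arg\min_{\alpha}\ \|y - H\Phi\alpha\|_2^2
  + \lambda \sum_i \|\alpha_i - \beta_i\|_1,
\qquad
\beta_i = \sum_{q \in \Omega_i} w_{i,q}\, \alpha_{i,q}
```

where Ω_i indexes the patches similar to patch i; centering each α_i on its nonlocal estimate β_i is what "nonlocally centralized" refers to.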
NCSR: solution
NCSR: The parameters and dictionaries
Gradient Histogram Preservation
[1] W. Zuo, L. Zhang, C. Song, and D. Zhang, “Texture Enhanced Image Denoising via Gradient Histogram Preservation,” in CVPR 2013.
[2] W. Zuo, L. Zhang, C. Song, D. Zhang, and H. Gao, “Gradient Histogram Estimation and Preservation for Texture Enhanced Image Denoising,” in TIP 2014.
Problem addressed:
Like noise, textures are fine-scale structures in images, and most denoising algorithms remove the textures along with the noise.
• Is it possible to preserve the texture structures, to some extent, while denoising?
GHP:
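As a hedged illustration of the core GHP operation (specifying the gradient-magnitude histogram of the denoised image to a reference histogram), not the authors' full algorithm; all names below are my own:

```python
import numpy as np

def match_histogram(values, reference):
    """Monotonically remap `values` so their empirical distribution matches
    that of `reference` (classic histogram specification)."""
    order = np.argsort(values)
    quantiles = np.linspace(0.0, 1.0, values.size)
    ref_quantiles = np.linspace(0.0, 1.0, reference.size)
    matched = np.empty_like(values, dtype=float)
    matched[order] = np.interp(quantiles, ref_quantiles, np.sort(reference))
    return matched

def specify_gradient_histogram(x, reference_grad_mag):
    """Rescale the gradient field of image `x` so that its gradient-magnitude
    histogram matches `reference_grad_mag` (e.g. an estimate of the noise-free
    texture statistics); returns the modified gradient field."""
    gx, gy = np.gradient(x.astype(float))
    mag = np.hypot(gx, gy)
    new_mag = match_histogram(mag.ravel(), reference_grad_mag.ravel()).reshape(mag.shape)
    scale = new_mag / np.maximum(mag, 1e-8)
    return gx * scale, gy * scale
```

In the papers the reference histogram is itself estimated from the noisy image and the constraint is folded into a sparse-coding objective; this sketch only shows the histogram-specification step.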
Group sparsity
Idea:
It is usually assumed that the basis for one kind of data differs from that for another, so the data can be grouped by sample or by feature.
The observation is that features or data items within a group are expected to share the same sparsity pattern in their latent factor representation.
Different types of features arise, e.g. in CV: pixel values, gradient features, 3D pose features, etc.; features of the same type form one group (see the sketch below).
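A minimal sketch of how that shared within-group pattern is typically enforced, via the proximal operator of a group-lasso penalty (the grouping and names are illustrative assumptions):

```python
import numpy as np

def group_soft_threshold(alpha, groups, lam):
    """Proximal operator of lam * sum_g ||alpha_g||_2 (group lasso).
    Each group is shrunk as a whole or zeroed as a whole, so the
    coefficients inside a group share the same sparsity pattern."""
    out = np.zeros_like(alpha, dtype=float)
    for g in groups:
        idx = np.asarray(list(g))
        norm = np.linalg.norm(alpha[idx])
        if norm > lam:
            out[idx] = (1.0 - lam / norm) * alpha[idx]
    return out

# example grouping: pixel-value, gradient, and 3D-pose feature blocks
# groups = [range(0, 64), range(64, 96), range(96, 110)]
```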
Rank decomposition
Rank describes the correlation structure of a matrix: each atom, through its position and arrangement, exhibits a two-dimensional order, so rank decomposition dissects the similarity structure from a 2D point of view.
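A minimal sketch of a rank decomposition via truncated SVD (the Eckart-Young best rank-r approximation), one standard way to expose that 2D similarity structure; the helper name is my own:

```python
import numpy as np

def low_rank_approx(M, r):
    """Best rank-r approximation of M in the least-squares sense:
    M ≈ U_r diag(s_r) V_r^T, keeping only the dominant correlations
    shared across rows and columns (the 2D structure)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r]
```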