Matrix Decomposition Study --- A Detailed Introduction to SVD

Singular value decomposition (SVD) is a factorization of a matrix in linear algebra that is widely used in signal processing and statistics. It decomposes a matrix into the product of a unitary matrix U, a diagonal matrix Σ, and the conjugate transpose of a unitary matrix V. Applications of the SVD include computing the pseudoinverse, least-squares fitting of data, determining the rank, range, and null space of a matrix, and low-rank matrix approximation. The singular values are the semi-axis lengths of an ellipse or ellipsoid, and the decomposition can be interpreted geometrically as a rotation, a scaling, and another rotation.


Matrix decomposition (also called matrix factorization) expresses a matrix as a product of several matrices. Such decompositions include triangular decomposition, full-rank decomposition, QR decomposition, Jordan decomposition, and the SVD (singular value decomposition). Three of the most common are: (1) triangular factorization, (2) QR factorization, and (3) singular value decomposition (SVD).
  (1) Triangular factorization
  Triangular factorization decomposes a square matrix into the product of a lower triangular matrix and an upper triangular matrix (or a permuted upper triangular matrix); it is therefore also called LU decomposition. Its main uses are simplifying the computation of the determinant of a large matrix, computing the matrix inverse, and solving systems of linear equations. Note that the pair of triangular factors obtained this way is not unique: several different pairs of lower and upper triangular matrices can be found whose product is the original matrix.
  MATLAB performs LU decomposition with the lu function; the syntax is [L,U]=lu(A).
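  As a minimal sketch of this syntax (the matrix A and right-hand side b below are made-up examples), the LU factors can be used to solve a linear system by forward and back substitution; with three output arguments, lu also returns the row permutation explicitly:

    A = [4 3 0; 6 3 1; 0 2 5];   % assumed example matrix
    b = [1; 2; 3];               % assumed right-hand side

    [L, U, P] = lu(A);           % P*A = L*U, with L lower and U upper triangular
    y = L \ (P*b);               % forward substitution
    x = U \ y;                   % back substitution
    disp(norm(A*x - b));         % residual, close to zero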
  (2) QR factorization
  QR factorization decomposes a matrix into the product of an orthonormal (orthogonal) matrix and an upper triangular matrix; the name comes from the conventional symbols Q for the orthonormal factor and R for the upper triangular factor.
  MATLAB performs QR decomposition with the qr function; the syntax is [Q,R]=qr(A).
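  A minimal sketch of this syntax, here applied to an overdetermined least-squares fit (the design matrix and observations below are assumptions):

    A = [1 1; 1 2; 1 3; 1 4];    % assumed design matrix (intercept and slope)
    b = [2.1; 3.9; 6.2; 7.8];    % assumed observations

    [Q, R] = qr(A, 0);           % economy-size QR: Q is 4x2, R is 2x2
    x = R \ (Q' * b);            % least-squares coefficients without forming A'*A
    disp(x);                     % fitted coefficients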
  (3) Singular value decomposition
  Singular value decomposition (SVD) is another orthogonal matrix decomposition. The SVD is the most reliable of these decompositions, but it takes roughly ten times as long to compute as the QR decomposition. In [U,S,V]=svd(A), U and V are two mutually orthogonal (unitary) matrices and S is a diagonal matrix. As with the QR decomposition, the original matrix A need not be square. Typical uses of the SVD are solving least-squares problems and data compression.

  MATLAB performs the SVD with the svd function; the syntax is [U,S,V]=svd(A).
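  A minimal sketch of this syntax, using an arbitrary example matrix, that checks the reconstruction and keeps only the largest singular value as a simple form of data compression:

    A = magic(4);                            % assumed example matrix

    [U, S, V] = svd(A);                      % A = U*S*V', U and V orthogonal, S diagonal
    disp(norm(A - U*S*V'));                  % reconstruction error, close to zero

    k = 1;                                   % keep only the largest singular value
    Ak = U(:,1:k) * S(1:k,1:k) * V(:,1:k)';  % best rank-k approximation
    disp(norm(A - Ak));                      % 2-norm error equals the (k+1)-th singular value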

Singular value decomposition

From Wikipedia, the free encyclopedia
Figure: Visualization of the SVD of a two-dimensional, real shearing matrix M. First, we see the unit disc in blue together with the two canonical unit vectors. We then see the action of M, which distorts the disc to an ellipse. The SVD decomposes M into three simple transformations: a rotation V*, a scaling Σ along the rotated coordinate axes, and a second rotation U. The lengths σ1 and σ2 of the semi-axes of the ellipse are the singular values of M.

In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix, with many useful applications in signal processing and statistics.

Formally, the singular value decomposition of an m×n real or complex matrix M is a factorization of the form

M = U\Sigma V^*

where U is an m×m real or complex unitary matrix, Σ is an m×n rectangular diagonal matrix with nonnegative real numbers on the diagonal, and V* (the conjugate transpose of V) is an n×n real or complex unitary matrix. The diagonal entries \Sigma_{ii} of Σ are known as the singular values of M. The m columns of U and the n columns of V are called the left singular vectors and right singular vectors of M, respectively.
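These shapes and the unitarity of U and V can be checked numerically; a small sketch assuming a random real 4×3 matrix (so the conjugate transpose is just the transpose):

    m = 4; n = 3;
    M = randn(m, n);             % assumed random real test matrix

    [U, S, V] = svd(M);
    disp(size(U));               % m-by-m
    disp(size(S));               % m-by-n, nonnegative diagonal entries
    disp(size(V));               % n-by-n
    disp(norm(U'*U - eye(m)));   % ~0: U is unitary (orthogonal, since M is real)
    disp(norm(V'*V - eye(n)));   % ~0: V is unitary
    disp(norm(M - U*S*V'));      % ~0: M = U*Sigma*V^*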

The singular value decomposition and the eigendecomposition are closely related. Namely:

  • The left singular vectors of M are eigenvectors of MM^{*}.
  • The right singular vectors of M are eigenvectors of M^{*}M.
  • The non-zero singular values of M (found on the diagonal entries of Σ) are the square roots of the non-zero eigenvalues of both M^{*}M and MM^{*}.
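These relations are easy to verify numerically; a hedged sketch with an arbitrary real test matrix:

    M = randn(4, 3);                     % assumed test matrix

    s  = svd(M);                         % singular values, in descending order
    e1 = sort(eig(M'*M), 'descend');     % eigenvalues of M^*M
    e2 = sort(eig(M*M'), 'descend');     % eigenvalues of MM^* (one extra zero)

    disp([s.^2, e1, e2(1:3)]);           % the squared singular values match both sets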

Applications which employ the SVD include computing the pseudoinverse, least squares fitting of data, matrix approximation, and determining the rank, range and null space of a matrix.
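For example, the pseudoinverse can be built directly from the factors by inverting only the nonzero singular values; a sketch using an assumed rank-deficient matrix and the usual tolerance rule:

    M = [1 2; 2 4; 3 6];                   % assumed rank-1 example

    [U, S, V] = svd(M);
    s    = diag(S);                        % singular values
    tol  = max(size(M)) * eps(norm(M));    % assumed tolerance for "nonzero"
    sinv = zeros(size(s));
    sinv(s > tol) = 1 ./ s(s > tol);       % invert only the nonzero singular values

    Splus = zeros(size(M'));               % Sigma^+ is n-by-m
    Splus(1:numel(s), 1:numel(s)) = diag(sinv);
    Mpinv = V * Splus * U';                % Moore-Penrose pseudoinverse
    disp(norm(Mpinv - pinv(M)));           % ~0: agrees with MATLAB's pinv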

Statement of the theorem

Suppose M is an m×n matrix whose entries come from the field K, which is either the field of real numbers or the field of complex numbers. Then there exists a factorization of the form

M = U\Sigma V^*

where U is an m×m unitary matrix over K, the matrix Σ is an m×n diagonal matrix with nonnegative real numbers on the diagonal, and the n×n unitary matrix V* denotes the conjugate transpose of V. Such a factorization is called the singular value decomposition of M.

The diagonal entries \sigma_i of Σ are known as the singular values of M. A common convention is to list the singular values in descending order. In this case, the diagonal matrix Σ is uniquely determined by M (though the matrices U and V are not).
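MATLAB's svd already returns the singular values in this descending order; the non-uniqueness of U and V can be seen by flipping the signs of a matched pair of singular vectors, as in this sketch:

    M = randn(3, 3);             % assumed test matrix

    [U, S, V] = svd(M);
    disp(diag(S)');              % singular values, largest first

    U2 = U;  V2 = V;
    U2(:,1) = -U2(:,1);          % flip the first left singular vector...
    V2(:,1) = -V2(:,1);          % ...and the first right singular vector together
    disp(norm(M - U2*S*V2'));    % ~0: another valid SVD with the same Sigma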

Intuitive interpretations

Rotation, scaling, rotation

In the special but common case in which M is just an m×m square matrix with positive determinant whose entries are plain real numbers, then U, V*, and Σ are m×m matrices of real numbers as well, Σ can be regarded as a scaling matrix, and U and V* can be viewed as rotation matrices.

If the above-mentioned conditions are met, the expression U\Sigma V^* can thus be intuitively interpreted as a composition (or sequence) of three geometrical transformations: a rotation, a scaling, and another rotation. For instance, the figure above explains how a shear matrix can be described as such a sequence.
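A shear like the one in the figure can be decomposed this way; a sketch assuming the shear matrix M = [1 1; 0 1]:

    M = [1 1; 0 1];              % assumed shear matrix, det(M) = 1 > 0

    [U, S, V] = svd(M);
    disp(S);                     % scaling along the rotated axes (the singular values)
    disp(det(U));                % when this is +1, U is a pure rotation
    disp(det(V'));               % likewise for the first transformation V^*
    disp(norm(M - U*S*V'));      % ~0: shear = rotation * scaling * rotation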

Singular values as semiaxes of an ellipse or ellipsoid

As shown in the figure, the singular values can be interpreted as the semiaxes of an ellipse in 2-D. This concept can be generalized to n-dimensional Euclidean space, with the singular values of any n×n square matrix being viewed as the semiaxes of an n-dimensional ellipsoid. See below for further details.

U and V are orthonormal bases

Since U and V* are unitary, the columns of each of them form a set of orthonormal vectors, which can be regarded as basis vectors. By the definition of unitary matrix, the same is true for their conjugate transposes U* and V. In short, U, U*, V, and V* are orthonormal bases.

Example

Consider the 4×5 matrix

M =
\begin{bmatrix}
1 & 0 & 0 & 0 & 2\\
0 & 0 & 3 & 0 & 0\\
0 & 0 & 0 & 0 & 0\\
0 & 4 & 0 & 0 & 0
\end{bmatrix}

A singular value decomposition of this matrix is given by M = U \Sigma V^*.
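The explicit factors are not reproduced here, but they can be computed numerically; a sketch that also shows the singular values working out to 4, 3, √5, and 0:

    M = [1 0 0 0 2;
         0 0 3 0 0;
         0 0 0 0 0;
         0 4 0 0 0];

    [U, S, V] = svd(M);
    disp(diag(S)');              % 4, 3, sqrt(5), 0 (descending)
    disp(norm(M - U*S*V'));      % ~0: the factorization reproduces M
    disp(rank(M));               % 3, the number of nonzero singular values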
