Eigenvalue Decomposition of Symmetric Matrices (convex)

This article explores the principles of eigenvalue decomposition of symmetric matrices and its applications in optimization, graphics, and data visualization; in particular, how the eigendecomposition can be used to determine whether a quadratic function is convex, to find the directions of maximal variance within a data set, and to serve as the foundation of principal component analysis (PCA).


Eigenvalue Decomposition of Symmetric Matrices


Symmetric matrices are square matrices whose elements mirror each other across the diagonal. They can be used to describe, for example, graphs with undirected, weighted edges between the nodes, or distance matrices (say, between cities), among a host of other applications. Symmetric matrices are also important in optimization, as they are closely related to quadratic functions.
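For instance (an illustrative example, not from the source), the weighted adjacency matrix of an undirected graph is symmetric, since the weight of the edge between nodes i and j does not depend on the order of the two nodes:

```matlab
% Weighted adjacency matrix of a 3-node undirected graph:
% W(i,j) is the weight of the edge between nodes i and j, so W = W'.
W = [0 2 1;
     2 0 3;
     1 3 0];
issymmetric(W)   % returns true (logical 1)
```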

A fundamental theorem, the spectral theorem, shows that we can decompose any symmetric matrix as a three-term product of matrices, involving an orthogonal transformation and a diagonal matrix. The theorem has a direct implication for quadratic functions: it allows us to decompose any quadratic function into a weighted sum of squared linear functions involving vectors that are mutually orthogonal. The weights are called the eigenvalues of the symmetric matrix.
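To make this concrete: with the SED $A = \sum_{i=1}^n \lambda_i u_i u_i^T$ stated in the next section, the quadratic form $q(x) = x^T A x$ splits into a weighted sum of squared linear functions,

$$q(x) = x^T A x = x^T \Big( \sum_{i=1}^n \lambda_i u_i u_i^T \Big) x = \sum_{i=1}^n \lambda_i \, (u_i^T x)^2 .$$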

The spectral theorem allows us, in particular, to determine when a given quadratic function is ‘‘bowl-shaped’’, that is, convex. The spectral theorem also allows us to find directions of maximal variance within a data set. Such directions are useful to visualize high-dimensional data points in two or three dimensions. This is the basis of a visualization method known as principal component analysis (PCA).
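Here is a minimal sketch of both uses (the matrix and data below are illustrative, not from the source): a quadratic form x'*A*x is convex exactly when all eigenvalues of A are nonnegative, and the direction of maximal variance of a data set is the eigenvector of its sample covariance matrix with the largest eigenvalue.

```matlab
% Convexity check: x'*A*x is convex iff all eigenvalues of A are >= 0.
A = [2 1; 1 2];               % illustrative symmetric matrix
isConvex = all(eig(A) >= 0);  % true here: the eigenvalues are 1 and 3

% Direction of maximal variance (first principal axis of the data).
X = randn(100, 3);            % illustrative data, one observation per row
C = cov(X);                   % sample covariance matrix (symmetric)
[U, D] = eig(C);              % columns of U: orthonormal eigenvectors
[~, k] = max(diag(D));        % index of the largest eigenvalue
vmax = U(:, k);               % direction of maximal variance
```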

From: https://inst.eecs.berkeley.edu/~ee127a/book/login/l_sym_main.html


Spectral theorem

An important result of linear algebra, called the spectral theorem, or symmetric eigenvalue decomposition (SED) theorem, states that any n × n symmetric matrix has exactly n (possibly not distinct) eigenvalues, all of them real; further, the associated eigenvectors can be chosen so as to form an orthonormal basis. The result offers a simple way to decompose the symmetric matrix as a product of simple transformations.

Theorem: Symmetric eigenvalue decomposition

We can decompose any symmetric matrix $A \in \mathbf{S}^n$ with the symmetric eigenvalue decomposition (SED)

$$A = \sum_{i=1}^n \lambda_i u_i u_i^T = U \Lambda U^T, \qquad \Lambda = \mathbf{diag}(\lambda_1, \ldots, \lambda_n),$$

where the matrix $U := [u_1, \ldots, u_n]$ is orthogonal (that is, $U^T U = U U^T = I_n$) and contains the eigenvectors of $A$, while the diagonal matrix $\Lambda$ contains the eigenvalues of $A$.

A proof is available at the source page linked below. The SED provides a decomposition of the matrix in simple terms, namely dyads (rank-one matrices of the form $u_i u_i^T$).

We check that in the SED above, the scalars $\lambda_i$ are indeed the eigenvalues of $A$, and the $u_i$'s are associated eigenvectors: since the $u_i$ form an orthonormal basis ($u_i^T u_j = 1$ if $i = j$, and $0$ otherwise),

$$A u_j = \sum_{i=1}^n \lambda_i u_i u_i^T u_j = \lambda_j u_j, \qquad j = 1, \ldots, n.$$
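For instance (a standard small example, not from the source), the $2 \times 2$ symmetric matrix below has eigenvalues 3 and 1 with orthonormal eigenvectors:

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} = U \Lambda U^T, \qquad U = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad \Lambda = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix},$$

and one checks directly that $A u_1 = 3 u_1$ and $A u_2 = u_2$.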

The eigenvalue decomposition of a symmetric matrix can be efficiently computed with standard software, in time that grows with the matrix dimension n as n^3 (that is, in $O(n^3)$ time). Here is the MATLAB syntax, where the first line ensures that MATLAB knows the matrix A is exactly symmetric.
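(The original snippet is not preserved in this copy; the following is a minimal reconstruction consistent with that description.)

```matlab
A = 0.5*(A + A');     % enforce exact symmetry so eig uses the symmetric solver
[U, Lambda] = eig(A); % U: orthonormal eigenvectors; Lambda: diagonal eigenvalues
```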

From: https://inst.eecs.berkeley.edu/~ee127a/book/login/l_sym_sed.html