Linear Algebra · SVD | Derivatives

Note: this post reproduces the English source for "Linear Algebra · SVD | Derivatives".


Singular value decomposition derivatives


Published Nov 10, 2023

The singular value decomposition (SVD) is a matrix decomposition that is used in many applications. It is defined as:

$$
J = U \Sigma V^T
$$

where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix with non-negative entries. The diagonal entries of $\Sigma$ are called the singular values of $J$ and are denoted as $\sigma_1,\dots,\sigma_n$. The singular values are the square roots of the eigenvalues of $J^TJ$. In this post we're going to go over how to differentiate the elements of the SVD under the assumption that the singular values are all distinct and non-zero.
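
As a quick numerical illustration, here is a minimal numpy sketch that computes an SVD, checks the reconstruction, and checks that the singular values are the square roots of the eigenvalues of $J^TJ$ (note that `np.linalg.svd` returns $V^T$ rather than $V$):

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((4, 4))

# np.linalg.svd returns U, the singular values, and V^T (not V).
U, s, Vt = np.linalg.svd(J)
V = Vt.T

# Reconstruction: J = U diag(s) V^T
assert np.allclose(J, U @ np.diag(s) @ V.T)

# Singular values are the square roots of the eigenvalues of J^T J.
eigvals = np.linalg.eigvalsh(J.T @ J)          # ascending order
assert np.allclose(np.sqrt(eigvals[::-1]), s)  # compare in descending order
```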

Einstein notation


Before proceeding, we need to understand Einstein notation. Einstein notation is an alternative way of writing matrix equations where we undo the matrix notation and remove the summation symbol. For example, let's write the equation $Ax = b$ in Einstein notation.

  • Step 1: Undo the matrix notation
    $$Ax = b \;\to\; \sum_j A_{ij}x_j = b_i$$

  • Step 2: Remove the summation symbol
    $$\sum_j A_{ij}x_j = b_i \;\to\; A_{ij}x_j = b_i$$

And that's it! All we did was remove the summation symbol. When we see Einstein notation in practice, we implicitly assume that there is a summation over indices that only appear on one side of the equality. Also, in Einstein notation we will make use of the Kronecker delta function $\delta_{ij}$, which is $1$ when $i=j$ and $0$ otherwise.
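
In code, `np.einsum` implements exactly this convention: indices that appear in the inputs but not in the output subscripts are summed over. A minimal sketch, with `A`, `x`, `b` mirroring the example above:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# A_{ij} x_j = b_i : j is summed because it does not appear in the output index 'i'.
b = np.einsum('ij,j->i', A, x)
assert np.allclose(b, A @ x)

# Kronecker delta: delta_{ij} = 1 if i == j else 0, i.e. the identity matrix.
delta = np.eye(3)
assert np.allclose(np.einsum('ij,j->i', delta, x), x)
```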

Orthogonal matrices


Next, we need to know how to differentiate orthogonal matrices. Let $Q$ be an orthogonal matrix; then by definition $Q_{ki}Q_{kj} = \delta_{ij}$. Taking the derivative yields:

$$
\begin{align*}
\partial (Q_{ki}Q_{kj}) &= \partial \delta_{ij} \\
\implies \partial Q_{ki}\, Q_{kj} + Q_{ki}\, \partial Q_{kj} &= 0 \\
\implies \partial Q_{ki}\, Q_{kj} &= -Q_{ki}\, \partial Q_{kj}
\end{align*}
$$

To make this equation clearer, we can undo some of the Einstein notation by letting $q_i := Q_{:,i}$ be the $i$th column of $Q$. Then we have:

$$
\partial q_i \cdot q_j = -\partial q_j \cdot q_i
$$

Note that when $i=j$, $\partial q_i \cdot q_i = 0$. This will be useful when we differentiate the SVD.
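
As an illustrative numerical check, take a curve of orthogonal matrices $Q(t) = e^{tA}$ with $A$ antisymmetric, finite-difference it at $t = 0$, and confirm that $\partial q_i \cdot q_j = -\partial q_j \cdot q_i$, and in particular that $\partial q_i \cdot q_i = 0$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
A = A - A.T                      # antisymmetric generator, so expm(t*A) is orthogonal

Q = lambda t: expm(t * A)
eps = 1e-6
dQ = (Q(eps) - Q(-eps)) / (2 * eps)   # finite-difference derivative at t = 0

S = Q(0.0).T @ dQ                     # S[i, j] = q_i . dq_j
assert np.allclose(S, -S.T, atol=1e-6)            # dq_i . q_j = -q_i . dq_j
assert np.allclose(np.diag(S), 0.0, atol=1e-6)    # dq_i . q_i = 0
```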

Singular value decomposition derivatives


Let's start by writing the SVD using Einstein notation:

$$
J_{ij} = U_{iu}\, \sigma_u\, V_{ju}
$$

With some rearrangement (contracting with $U_{iu}$ or $V_{ju}$ and using orthogonality), we can write two equations:

$$
\begin{align*}
J_{ij}U_{iu} &= \sigma_u V_{ju} \\
J_{ij}V_{ju} &= \sigma_u U_{iu}
\end{align*}
$$
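
Both contractions are easy to verify numerically; a small sketch using `np.einsum` (again, numpy's `svd` returns $V^T$, so we transpose to get $V$):

```python
import numpy as np

rng = np.random.default_rng(3)
J = rng.standard_normal((4, 4))
U, s, Vt = np.linalg.svd(J)
V = Vt.T

# J_{ij} U_{iu} = sigma_u V_{ju}   (sum over i)
assert np.allclose(np.einsum('ij,iu->ju', J, U), V * s)
# J_{ij} V_{ju} = sigma_u U_{iu}   (sum over j)
assert np.allclose(np.einsum('ij,ju->iu', J, V), U * s)
```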

By applying a derivative to these two equations, we get:

$$
\begin{align*}
\partial J_{ij}\, U_{iu} + J_{ij}\, \partial U_{iu} &= \partial \sigma_u\, V_{ju} + \sigma_u\, \partial V_{ju} \\
\partial J_{ij}\, V_{ju} + J_{ij}\, \partial V_{ju} &= \partial \sigma_u\, U_{iu} + \sigma_u\, \partial U_{iu}
\end{align*}
$$

We'll call the first equation "equation 1" and the second "equation 2".

Singular value derivatives


To get the derivatives of the singular values, we can multiply both sides of equation 2 by $U_{iu}$ and sum over $i$:

$$
\begin{align*}
\partial J_{ij}\, V_{ju} U_{iu} + \underbrace{J_{ij}\,\partial V_{ju}\, U_{iu}}_{\sigma_u V_{ju} \partial V_{ju} = 0} &= \partial \sigma_u \underbrace{U_{iu} U_{iu}}_{1} + \sigma_u \underbrace{\partial U_{iu}\, U_{iu}}_{0} \\
\implies \partial \sigma_u &= \partial J_{ij}\, U_{iu} V_{ju}
\end{align*}
$$

Note: two results are used here:

  1. $J_{ij}U_{iu} = \sigma_u V_{ju}$, so $J_{ij}\,\partial V_{ju}\, U_{iu} = \sigma_u V_{ju}\,\partial V_{ju}$, which vanishes by the orthogonal-matrix result $\partial V_{ju}\, V_{ju} = 0$;
  2. the columns of an orthogonal matrix have unit norm: $U_{iu} U_{iu} = 1$, and $\partial U_{iu}\, U_{iu} = 0$.
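
Here is an illustrative finite-difference check of this formula, relying on the assumption that the singular values are distinct and non-zero; `dJ` is an arbitrary perturbation direction:

```python
import numpy as np

rng = np.random.default_rng(4)
J = rng.standard_normal((4, 4))
dJ = rng.standard_normal((4, 4))        # direction of the perturbation
eps = 1e-6

U, s, Vt = np.linalg.svd(J)
V = Vt.T

# Predicted derivative: d sigma_u = U_{iu} dJ_{ij} V_{ju}
ds_pred = np.einsum('iu,ij,ju->u', U, dJ, V)

# Finite-difference derivative of the singular values.
s_plus = np.linalg.svd(J + eps * dJ, compute_uv=False)
s_minus = np.linalg.svd(J - eps * dJ, compute_uv=False)
ds_fd = (s_plus - s_minus) / (2 * eps)

assert np.allclose(ds_pred, ds_fd, atol=1e-5)
```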

Singular vector derivatives


Next, to isolate the derivatives of the singular vectors, we'll first multiply both sides of equation 1 by $V_{jv}$, where $v \neq u$, and sum over $j$:

$$
\begin{align*}
\partial J_{ij}\, U_{iu} V_{jv} + \underbrace{J_{ij}\,\partial U_{iu}\, V_{jv}}_{\sigma_v \partial U_{iu} U_{iv}} &= \partial \sigma_u \underbrace{V_{ju} V_{jv}}_{0} + \sigma_u\, \partial V_{ju}\, V_{jv} \\
\partial J_{ij}\, U_{iu} V_{jv} &= -\sigma_v\, \partial U_{iu}\, U_{iv} + \sigma_u\, \partial V_{ju}\, V_{jv}
\end{align*}
$$

Note:

  1. $J_{ij}V_{jv} = \sigma_v U_{iv}$, so $J_{ij}\,\partial U_{iu}\, V_{jv} = \sigma_v\, \partial U_{iu}\, U_{iv}$;
  2. orthogonality of the columns: $V_{ju} V_{jv} = 0$ (since $v \neq u$).

Similarly, we can do the same with equation 2, but multiply by $U_{iv}$, where $v \neq u$, and sum over $i$:

$$
\begin{align*}
\partial J_{ij}\, V_{ju} U_{iv} + \underbrace{J_{ij}\,\partial V_{ju}\, U_{iv}}_{\sigma_v \partial V_{ju} V_{jv}} &= \partial \sigma_u \underbrace{U_{iu} U_{iv}}_{0} + \sigma_u\, \partial U_{iu}\, U_{iv} \\
\partial J_{ij}\, U_{iv} V_{ju} &= \sigma_u\, \partial U_{iu}\, U_{iv} - \sigma_v\, \partial V_{ju}\, V_{jv}
\end{align*}
$$

Note:

  1. $J_{ij}U_{iv} = \sigma_v V_{jv}$, so $J_{ij}\,\partial V_{ju}\, U_{iv} = \sigma_v\, \partial V_{ju}\, V_{jv}$;
  2. orthogonality of the columns: $U_{iu} U_{iv} = 0$ (since $v \neq u$).

So we're left with the two equations:

$$
\begin{align*}
\partial J_{ij}\, U_{iu} V_{jv} &= -\sigma_v\, \partial U_{iu}\, U_{iv} + \sigma_u\, \partial V_{ju}\, V_{jv} \\
\partial J_{ij}\, U_{iv} V_{ju} &= \sigma_u\, \partial U_{iu}\, U_{iv} - \sigma_v\, \partial V_{ju}\, V_{jv}
\end{align*}
$$

Left singular vectors

Let's multiply the above equations by $\sigma_v$ and $\sigma_u$ respectively:

$$
\begin{align*}
\sigma_v\, \partial J_{ij}\, U_{iu} V_{jv} &= -\sigma_v^2\, \partial U_{iu}\, U_{iv} + \sigma_v \sigma_u\, \partial V_{ju}\, V_{jv} \\
\sigma_u\, \partial J_{ij}\, U_{iv} V_{ju} &= \sigma_u^2\, \partial U_{iu}\, U_{iv} - \sigma_v \sigma_u\, \partial V_{ju}\, V_{jv}
\end{align*}
$$

If we sum the equations, the last terms cancel and we’re left with

$$
\begin{align*}
\partial J_{ij}\left(\sigma_u U_{iv} V_{ju} + \sigma_v U_{iu} V_{jv}\right) &= (\sigma_u^2 - \sigma_v^2)\, \partial U_{iu}\, U_{iv} \\
\implies \partial U_{iu}\, U_{iv} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \partial J_{ij}\left(\sigma_u U_{iv} V_{ju} + \sigma_v U_{iu} V_{jv}\right)
\end{align*}
$$
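
Translated into code, this gives the projection of $\partial U_u$ onto each $U_{v \neq u}$. The sketch below (the names `A` and `C` are purely illustrative) builds the whole coefficient matrix and checks that it is antisymmetric, as required by the orthogonal-matrix result above:

```python
import numpy as np

rng = np.random.default_rng(5)
J = rng.standard_normal((4, 4))
dJ = rng.standard_normal((4, 4))          # an arbitrary perturbation direction
U, s, Vt = np.linalg.svd(J)
V = Vt.T
n = len(s)

A = U.T @ dJ @ V                          # A[a, b] = dJ_{ij} U_{ia} V_{jb}

# C[u, v] = dU_u . U_v = (s_u A[v, u] + s_v A[u, v]) / (s_u^2 - s_v^2) for v != u
C = np.zeros((n, n))
for u in range(n):
    for v in range(n):
        if v != u:
            C[u, v] = (s[u] * A[v, u] + s[v] * A[u, v]) / (s[u]**2 - s[v]**2)

# Consistency with the orthogonal-matrix result: dU_u . U_v = -dU_v . U_u
assert np.allclose(C, -C.T)
```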

Right singular vectors

Similarly, if we instead multiply by $\sigma_u$ and $\sigma_v$ respectively, we get:

$$
\begin{align*}
\sigma_u\, \partial J_{ij}\, U_{iu} V_{jv} &= -\sigma_v \sigma_u\, \partial U_{iu}\, U_{iv} + \sigma_u^2\, \partial V_{ju}\, V_{jv} \\
\sigma_v\, \partial J_{ij}\, U_{iv} V_{ju} &= \sigma_v \sigma_u\, \partial U_{iu}\, U_{iv} - \sigma_v^2\, \partial V_{ju}\, V_{jv}
\end{align*}
$$

If we sum the equations, the first terms on the RHS cancel and we’re left with

$$
\begin{align*}
\partial J_{ij}\left(\sigma_v U_{iv} V_{ju} + \sigma_u U_{iu} V_{jv}\right) &= (\sigma_u^2 - \sigma_v^2)\, \partial V_{ju}\, V_{jv} \\
\implies \partial V_{ju}\, V_{jv} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \partial J_{ij}\left(\sigma_v U_{iv} V_{ju} + \sigma_u U_{iu} V_{jv}\right)
\end{align*}
$$
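
The right-singular-vector formula only swaps which singular value multiplies which entry of $A_{ab} = \partial J_{ij}\, U_{ia} V_{jb}$. A sketch that collects both projection formulas into a single hypothetical helper (the name `svd_vector_coefficients` is illustrative, not an established API):

```python
import numpy as np

def svd_vector_coefficients(U, s, V, dJ):
    """Illustrative helper: C[u, v] = dU_u . U_v and D[u, v] = dV_u . V_v for v != u."""
    A = U.T @ dJ @ V                      # A[a, b] = dJ_{ij} U_{ia} V_{jb}
    n = len(s)
    C = np.zeros((n, n))                  # left singular vector projections
    D = np.zeros((n, n))                  # right singular vector projections
    for u in range(n):
        for v in range(n):
            if v == u:
                continue
            denom = s[u]**2 - s[v]**2
            C[u, v] = (s[u] * A[v, u] + s[v] * A[u, v]) / denom
            D[u, v] = (s[v] * A[v, u] + s[u] * A[u, v]) / denom
    return C, D

# Example usage on a random matrix and perturbation direction.
rng = np.random.default_rng(5)
J, dJ = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
U, s, Vt = np.linalg.svd(J)
C, D = svd_vector_coefficients(U, s, Vt.T, dJ)
assert np.allclose(D, -D.T)               # same antisymmetry as for the left vectors
```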

Summary


To simplify the expressions, we'll use the notation $U_i := U_{:,i}$ and $V_i := V_{:,i}$ to denote the $i$th column of $U$ and $V$ respectively. Then returning to matrix notation yields:

$$
\begin{align*}
\partial \sigma_u &= \partial J_{ij}\, U_{iu} V_{ju} \\
\partial U_u \cdot U_{v\neq u} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \partial J_{ij}\left(\sigma_u U_{iv} V_{ju} + \sigma_v U_{iu} V_{jv}\right) \\
\partial V_u \cdot V_{v\neq u} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \partial J_{ij}\left(\sigma_v U_{iv} V_{ju} + \sigma_u U_{iu} V_{jv}\right)
\end{align*}
$$

Note that to isolate the derivatives of $U$ and $V$, we can write them as a linear combination of the singular vectors:

$$
\begin{align*}
\partial U_u &= \sum_{v \neq u} (\partial U_u \cdot U_v)\, U_v \\
\partial V_u &= \sum_{v \neq u} (\partial V_u \cdot V_v)\, V_v
\end{align*}
$$

Because $U$ and $V$ are orthogonal, $\partial U_u \cdot U_u = \partial V_u \cdot V_u = 0$, so the $v = u$ term drops out of the sums above.
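
Putting the pieces together, here is a sketch that assembles $\partial U$ and $\partial V$ from these linear combinations, written in a vectorized form of the summary formulas, and checks them against finite differences of numpy's SVD (the `aligned_svd` helper is illustrative; the sign alignment is needed because an SVD routine only determines each pair of singular vectors up to a simultaneous sign flip):

```python
import numpy as np

rng = np.random.default_rng(6)
J = rng.standard_normal((4, 4))
dJ = rng.standard_normal((4, 4))
eps = 1e-6

U, s, Vt = np.linalg.svd(J)
V = Vt.T

# Vectorized form of the summary formulas: A[a, b] = dJ_{ij} U_{ia} V_{jb}.
A = U.T @ dJ @ V
S2 = s[:, None]**2 - s[None, :]**2
np.fill_diagonal(S2, np.inf)                       # excludes the v = u term
C = (s[:, None] * A.T + s[None, :] * A) / S2       # C[u, v] = dU_u . U_v
D = (s[None, :] * A.T + s[:, None] * A) / S2       # D[u, v] = dV_u . V_v

# dU_u = sum_{v != u} (dU_u . U_v) U_v, and similarly for dV_u.
dU_pred = U @ C.T
dV_pred = V @ D.T

def aligned_svd(M, U_ref):
    """SVD with each left/right singular-vector pair sign-aligned to a reference basis."""
    Um, sm, Vmt = np.linalg.svd(M)
    signs = np.sign(np.sum(Um * U_ref, axis=0))
    return Um * signs, sm, (Vmt.T) * signs

U_p, _, V_p = aligned_svd(J + eps * dJ, U)
U_m, _, V_m = aligned_svd(J - eps * dJ, U)

assert np.allclose((U_p - U_m) / (2 * eps), dU_pred, atol=1e-4)
assert np.allclose((V_p - V_m) / (2 * eps), dV_pred, atol=1e-4)
```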

Time derivative


We can also see how the singular vectors and singular values evolve when we flow on the vector field:

$$
\frac{dx_t}{dt} = X_t(x_t)
$$

To do this, recall that we can write the time derivative of the components of $J$ as:

$$
\frac{dJ}{dt} = \nabla X_t\, J
$$

Then we can substitute this into the SVD derivative formulas to get the time derivatives of the SVD elements.
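
As a deliberately simple concrete example: for a linear, time-independent field $X_t(x) = Bx$ we have $\nabla X_t = B$, the Jacobian of the flow map is $J_t = e^{tB}$, and it satisfies the equation above. A sketch:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(7)
B = rng.standard_normal((3, 3))          # grad X_t for the linear field X_t(x) = B x

t, eps = 0.7, 1e-6
J = lambda t: expm(t * B)                # Jacobian of the flow map of a linear field

dJ_fd = (J(t + eps) - J(t - eps)) / (2 * eps)
assert np.allclose(dJ_fd, B @ J(t), atol=1e-5)   # dJ/dt = grad(X_t) J
```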

Singular value derivatives


$$
\begin{align*}
\frac{d\sigma_u}{dt} &= \frac{dJ_{ij}}{dt}\, U_{iu} V_{ju} \\
&= (\nabla X_t)_{ik}\, J_{kj}\, U_{iu} V_{ju} \\
&= (\nabla X_t)_{ik}\, U_{iu}\, \sigma_u U_{ku}
\end{align*}
$$

This is more simply expressed using the log of the singular values:

$$
\frac{d\log \sigma_u}{dt} = (\nabla X_t)_{ik}\, U_{iu} U_{ku}
$$

(Note: dividing both sides of $\frac{d\sigma_u}{dt} = (\nabla X_t)_{ik} U_{iu} \sigma_u U_{ku}$ by $\sigma_u$ and using $\frac{1}{\sigma_u}\frac{d\sigma_u}{dt} = \frac{d\log \sigma_u}{dt}$ gives the logarithmic form above, which is more convenient when looking at the relative rate of change of the singular values.)
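
Continuing that linear-field sketch, an illustrative finite-difference check of the logarithmic form $\frac{d\log \sigma_u}{dt} = (\nabla X_t)_{ik} U_{iu} U_{ku}$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(7)
B = rng.standard_normal((3, 3))          # grad X_t for the linear field X_t(x) = B x
J = lambda t: expm(t * B)                # Jacobian of the flow map
t, eps = 0.7, 1e-6

U, s, Vt = np.linalg.svd(J(t))

# Predicted: d log(sigma_u) / dt = U_u^T (grad X_t) U_u
dlogs_pred = np.einsum('iu,ik,ku->u', U, B, U)

s_plus = np.linalg.svd(J(t + eps), compute_uv=False)
s_minus = np.linalg.svd(J(t - eps), compute_uv=False)
dlogs_fd = (np.log(s_plus) - np.log(s_minus)) / (2 * eps)

assert np.allclose(dlogs_pred, dlogs_fd, atol=1e-5)
```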

Left singular vector derivatives


$$
\begin{align*}
\frac{dU_u}{dt} \cdot U_{v\neq u} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \frac{dJ_{ij}}{dt}\left(\sigma_u U_{iv} V_{ju} + \sigma_v U_{iu} V_{jv}\right) \\
&= \frac{1}{\sigma_u^2 - \sigma_v^2}\, (\nabla X_t)_{ik} J_{kj}\left(\sigma_u U_{iv} V_{ju} + \sigma_v U_{iu} V_{jv}\right) \\
&= \frac{1}{\sigma_u^2 - \sigma_v^2}\, (\nabla X_t)_{ik}\left(\sigma_u^2 U_{iv} U_{ku} + \sigma_v^2 U_{iu} U_{kv}\right) \\
&= \frac{\sigma_u^2}{\sigma_u^2 - \sigma_v^2}\, U_v^T(\nabla X_t)U_u + \frac{\sigma_v^2}{\sigma_u^2 - \sigma_v^2}\, U_u^T(\nabla X_t)U_v \\
&= U_v^T\left(\frac{\sigma_u^2}{\sigma_u^2 - \sigma_v^2}\,\nabla X_t + \frac{\sigma_v^2}{\sigma_u^2 - \sigma_v^2}\,\nabla X_t^T\right)U_u
\end{align*}
$$

Notes:

  1. Replace $\partial$ in the left-singular-vector formula with the time derivative $\frac{d}{dt}$ and substitute $\frac{dJ_{ij}}{dt} = (\nabla X_t)_{ik} J_{kj}$;
  2. use $J_{kj} V_{ju} = \sigma_u U_{ku}$ and $J_{kj} V_{jv} = \sigma_v U_{kv}$ (both follow from the SVD);
  3. the last two steps convert the Einstein notation into matrix products, with $U_v^T(\nabla X_t)U_u$ corresponding to $(\nabla X_t)_{ik} U_{iv} U_{ku}$ and $U_u^T(\nabla X_t)U_v$ corresponding to $(\nabla X_t)_{ik} U_{iu} U_{kv}$, and then collect terms to obtain the final result.

Right singular vector derivatives


$$
\begin{align*}
\frac{dV_u}{dt} \cdot V_{v\neq u} &= \frac{1}{\sigma_u^2 - \sigma_v^2}\, \frac{dJ_{ij}}{dt}\left(\sigma_v U_{iv} V_{ju} + \sigma_u U_{iu} V_{jv}\right) \\
&= \frac{1}{\sigma_u^2 - \sigma_v^2}\, (\nabla X_t)_{ik} J_{kj}\left(\sigma_v U_{iv} V_{ju} + \sigma_u U_{iu} V_{jv}\right) \\
&= \frac{1}{\sigma_u^2 - \sigma_v^2}\, (\nabla X_t)_{ik}\left(\sigma_u\sigma_v U_{iv} U_{ku} + \sigma_u\sigma_v U_{iu} U_{kv}\right) \\
&= \frac{\sigma_u \sigma_v}{\sigma_u^2 - \sigma_v^2}\left( U_v^T(\nabla X_t)U_u + U_u^T(\nabla X_t)U_v \right) \\
&= \frac{\sigma_u \sigma_v}{\sigma_u^2 - \sigma_v^2}\, U_v^T(\nabla X_t + \nabla X_t^T)U_u
\end{align*}
$$

Notes:

  1. As with the left singular vectors, the first step replaces the derivative symbol and substitutes the expression for $\frac{dJ}{dt}$;

  2. the second step again uses $J_{kj} V_{ju} = \sigma_u U_{ku}$ and $J_{kj} V_{jv} = \sigma_v U_{kv}$, which produces the common factor $\sigma_u \sigma_v$ in front of $U_{iv} U_{ku} + U_{iu} U_{kv}$;

  3. the final step combines the two terms into $U_v^T(\nabla X_t + \nabla X_t^T)U_u$, using $U_u^T(\nabla X_t)U_v = \left(U_v^T(\nabla X_t^T)U_u\right)^T$ together with the fact that a scalar equals its own transpose.
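
Finally, an illustrative consistency check that these closed forms agree with the general projection formulas from the summary when $\partial J$ is replaced by $\nabla X_t\, J$, again using the linear-field example:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(7)
B = rng.standard_normal((3, 3))     # grad X_t for the linear field X_t(x) = B x
J = expm(0.7 * B)                   # Jacobian of the flow map at t = 0.7

U, s, Vt = np.linalg.svd(J)
V = Vt.T
A = U.T @ (B @ J) @ V               # summary formulas evaluated at dJ = grad(X_t) J

n = len(s)
for u in range(n):
    for v in range(n):
        if v == u:
            continue
        denom = s[u]**2 - s[v]**2
        # General projection formulas from the summary, with dJ = grad(X_t) J.
        dUu_Uv_general = (s[u] * A[v, u] + s[v] * A[u, v]) / denom
        dVu_Vv_general = (s[v] * A[v, u] + s[u] * A[u, v]) / denom
        # Closed forms derived in this section.
        dUu_Uv_closed = U[:, v] @ ((s[u]**2 * B + s[v]**2 * B.T) / denom) @ U[:, u]
        dVu_Vv_closed = (s[u] * s[v] / denom) * (U[:, v] @ (B + B.T) @ U[:, u])
        assert np.isclose(dUu_Uv_general, dUu_Uv_closed)
        assert np.isclose(dVu_Vv_general, dVu_Vv_closed)
```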


via:
