QR & RQ Factorization

This post describes how to perform an RQ decomposition with the SciPy library and provides an example of implementing RQ decomposition yourself. It also explains how to ensure that the diagonal elements of the resulting matrix are positive, which is especially important for applications in computer vision.

The RQ factorization is available in scipy.linalg, even though it is missing from the documentation (here is a page that shows it though). To use this version, import rq like this:


from scipy.linalg import rq
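

A minimal usage sketch (the random test matrix below is just an illustration, not from the original post): rq returns R and Q such that A = R*Q, with R upper triangular and Q orthogonal.


import numpy as np
from scipy.linalg import rq

A = np.random.rand(3, 3)       # any square, full-rank matrix will do
R, Q = rq(A)

print(np.allclose(A, R @ Q))   # True: A is reconstructed as R*Q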


Alternatively, you can use the more common QR factorization and, with some modifications, write your own RQ function.


from numpy import flipud
from scipy.linalg import qr

def rq(A):
    # QR-factorize the row-reversed transpose of A, then undo the flips
    Q, R = qr(flipud(A).T)
    R = flipud(R.T)
    Q = Q.T
    # reversing the columns of R / rows of Q makes R upper triangular again
    return R[:, ::-1], Q[::-1, :]
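

As a quick sanity check (a sketch only; the random test matrix is not from the original post), this hand-rolled rq should satisfy the same properties as SciPy's, although the two may differ in the signs of individual rows and columns:


import numpy as np

A = np.random.rand(3, 3)
R, Q = rq(A)   # the rq() defined above

print(np.allclose(A, R @ Q))             # True: A = R*Q
print(np.allclose(R, np.triu(R)))        # True: R is upper triangular
print(np.allclose(Q @ Q.T, np.eye(3)))   # True: Q is orthogonal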


RQ factorization is not unique; the sign of the diagonal elements can vary. In computer vision we need them to be positive so they can correspond to the focal length and other positive parameters. To get a consistent result with a positive diagonal, you can apply a transform that flips the signs where needed. Try this on a camera matrix P:


from numpy import diag, sign, dot

# factor the first 3x3 part of P
K, R = rq(P[:, :3])

# make the diagonal of K positive
T = diag(sign(diag(K)))

K = dot(K, T)
R = dot(T, R)   # T is its own inverse
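

Putting the pieces together, here is a self-contained sketch on a made-up camera matrix (K_true, R_true and t below are purely illustrative values, chosen so we know what the factorization should return; scipy.linalg.rq is used here, but the hand-rolled rq above works the same way):


import numpy as np
from numpy import diag, sign, dot
from scipy.linalg import rq

# illustrative intrinsics (upper triangular, positive diagonal),
# a rotation about the z-axis, and a translation
K_true = np.array([[1000.0,   2.0, 320.0],
                   [   0.0, 995.0, 240.0],
                   [   0.0,   0.0,   1.0]])
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
t = np.array([[0.5], [0.1], [2.0]])
P = dot(K_true, np.hstack((R_true, t)))   # P = K [R | t]

# factor the first 3x3 part of P and fix the signs as above
K, R = rq(P[:, :3])
T = diag(sign(diag(K)))
K = dot(K, T)
R = dot(T, R)   # T is its own inverse

print(np.allclose(K, K_true))   # True: intrinsics recovered
print(np.allclose(R, R_true))   # True: rotation recovered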


The RQ decomposition factors a matrix A into the product of an upper triangular matrix R (also known as right-triangular) and an orthogonal matrix Q. The only difference from the QR decomposition is the order of the factors: A = RQ instead of A = QR.

QR decomposition is Gram–Schmidt orthogonalization of the columns of A, starting from the first column.

RQ decomposition is Gram–Schmidt orthogonalization of the rows of A, starting from the last row.
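

A small numeric illustration of that difference (a sketch, not from the original post): in A = QR the first column of Q is just the first column of A normalized, while in A = RQ the last row of Q is the last row of A normalized (up to sign in both cases).


import numpy as np
from scipy.linalg import qr, rq

A = np.random.rand(4, 4)

Q1, R1 = qr(A)   # A = Q1*R1: columns of A orthogonalized from the first
R2, Q2 = rq(A)   # A = R2*Q2: rows of A orthogonalized from the last

# first column of Q1 is A[:, 0] normalized (up to sign)
print(np.allclose(np.abs(Q1[:, 0]), np.abs(A[:, 0] / np.linalg.norm(A[:, 0]))))

# last row of Q2 is A[-1, :] normalized (up to sign)
print(np.allclose(np.abs(Q2[-1, :]), np.abs(A[-1, :] / np.linalg.norm(A[-1, :]))))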

Reposted from: https://www.cnblogs.com/ShaneZhang/archive/2013/06/13/3134655.html
