Understanding the Levenberg-Marquardt (LM) Algorithm
1. convex optimization
First, a quick introduction to the basic concepts of convex optimization.
1.1 convex set
Definition:
For $x_{1}, x_{2} \in A$, we say $A$ is a convex set if and only if the following holds for any $\theta \in [0,1]$:
$$\theta x_{1} + (1-\theta)x_{2} \in A$$

The definition of a convex set is given above, but it is easy to grasp intuitively: if, for any two points $x_{1}$ and $x_{2}$ in the set, every point on the line segment connecting them also lies in the set $A$, then $A$ is a convex set.
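For example (this particular example is my own illustration, not from the original text), the Euclidean ball $B_{r} = \{x : \|x\| \le r\}$ is a convex set: for $x_{1}, x_{2} \in B_{r}$ and $\theta \in [0,1]$,
$$\|\theta x_{1} + (1-\theta)x_{2}\| \le \theta\|x_{1}\| + (1-\theta)\|x_{2}\| \le \theta r + (1-\theta)r = r,$$
so the combination stays inside $B_{r}$.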

1.2 convex function
Definition:
We say $f(x)$ with domain $\mathrm{dom}(f)$ is a convex function if and only if $\mathrm{dom}(f)$ is a convex set and the following holds for any $x_{1}, x_{2} \in \mathrm{dom}(f)$ and any $\theta \in [0,1]$:
$$f(\theta x_{1} + (1-\theta)x_{2}) \le \theta f(x_{1}) + (1-\theta) f(x_{2})$$
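As a quick check of the definition (an example added here for illustration), $f(x) = x^{2}$ is convex, since
$$\theta x_{1}^{2} + (1-\theta)x_{2}^{2} - \left(\theta x_{1} + (1-\theta)x_{2}\right)^{2} = \theta(1-\theta)\left(x_{1} - x_{2}\right)^{2} \ge 0.$$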
Understanding the Application of the Levenberg-Marquardt Algorithm in SLAM

This article introduces the concepts of convex optimization, including convex sets, convex functions, and optimization problems, and emphasizes that in convex optimization a local minimum is also the global minimum. It then discusses the bundle adjustment problem in SLAM, involving the Gauss-Newton method and the LM algorithm, in particular the role of the damping factor and regularization in LM, and how the choice of $\mu$ relates to the Hessian matrix.
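The full derivation is beyond the excerpt above, but as a rough illustration of the damping idea, here is a minimal sketch of an LM-style loop in Python. It solves the damped normal equations $(J^{\top}J + \mu I)\,\Delta x = -J^{\top} r$ and adjusts $\mu$ depending on whether the step reduces the cost; the function names (`residual_fn`, `jacobian_fn`) and the simple halve/double update of $\mu$ are assumptions for this sketch, not the exact scheme discussed in the article.

```python
import numpy as np

def levenberg_marquardt(residual_fn, jacobian_fn, x0, mu=1e-3, max_iters=100, tol=1e-8):
    """Minimal Levenberg-Marquardt sketch (illustrative, not a reference implementation).

    residual_fn(x) -> residual vector r of shape (m,)
    jacobian_fn(x) -> Jacobian J of shape (m, n)
    Each step solves (J^T J + mu * I) dx = -J^T r, where mu is the damping factor.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        r = residual_fn(x)
        J = jacobian_fn(x)
        H = J.T @ J                      # Gauss-Newton approximation of the Hessian
        g = J.T @ r                      # gradient of 0.5 * ||r||^2
        if np.linalg.norm(g, ord=np.inf) < tol:
            break
        dx = np.linalg.solve(H + mu * np.eye(H.shape[0]), -g)
        r_new = residual_fn(x + dx)
        if 0.5 * r_new @ r_new < 0.5 * r @ r:
            x = x + dx                   # step accepted: trust the quadratic model more
            mu *= 0.5
        else:
            mu *= 2.0                    # step rejected: increase damping (closer to gradient descent)
    return x
```

For a least-squares curve-fitting problem, `residual_fn` would return the vector of measurement residuals and `jacobian_fn` its Jacobian with respect to the parameters; large $\mu$ pushes the step toward (scaled) gradient descent, while small $\mu$ recovers the Gauss-Newton step.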