A problem well suited for a warm-up.
Bisection implementation
There are two ways to implement sqrt. The first is bisection, much like mental arithmetic: take half of the input; if its square is larger than the input, halve it again; if its square is smaller, take the average of the previous and current estimates, and keep closing in this way. Note, however, that on the interval (0, 1) the evaluation and the way the left/right bounds are updated need to change.
epsilon = 1e-9

def sqrt_Bisection(y):
    if y < 0.0:
        raise ValueError("input is not non-negative")
    if y == 0.0 or y == 1.0:
        return y
    if y < 1:
        # For 0 < y < 1 the root is larger than y, so start just above y and
        # grow the right bound whenever the estimate is still too small.
        left = y
        right = 2 * y
        mid = (left + right) / 2
        res = mid ** 2 - y
        while abs(res) > epsilon:
            if res < 0:
                # Estimate too small: move the left bound up and widen to the right.
                left = mid
                right = 2 * mid
            elif res > 0:
                # Estimate too large: shrink the right bound.
                right = mid
            mid = (left + right) / 2
            res = mid ** 2 - y
        return mid
    if y > 1:
        # For y > 1 the root lies in (0, y): plain bisection on that interval.
        left = 0
        right = y
        mid = (left + right) / 2
        res = mid ** 2 - y
        while abs(res) > epsilon:
            if res < 0:
                left = mid
            elif res > 0:
                right = mid
            mid = (left + right) / 2
            res = mid ** 2 - y
        return mid
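Incidentally, a simpler way to handle the (0, 1) interval is to note that sqrt(y) always lies between y and 1 there, so a fixed bracket suffices. The sketch below (a variation for illustration, with a hypothetical name, not the version above) uses one plain bisection loop for both ranges:

def sqrt_bisection_fixed_bracket(y, eps=1e-9):
    # Plain bisection with a fixed bracket (illustrative variation):
    # sqrt(y) lies in [y, 1] when 0 < y < 1, and in [1, y] when y > 1.
    if y < 0:
        raise ValueError("input is not non-negative")
    if y == 0.0 or y == 1.0:
        return y
    left, right = (y, 1.0) if y < 1 else (1.0, y)
    mid = (left + right) / 2
    while abs(mid ** 2 - y) > eps:
        if mid ** 2 < y:
            left = mid
        else:
            right = mid
        mid = (left + right) / 2
    return mid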
A numerical-optimization approach
This can be viewed as an optimal-estimation problem:

$$y = f(P)$$

that is, find the optimal estimate of the parameter P given the observation y, where

$$f(P) = P^2$$
For this problem, the Gauss-Newton method and gradient descent lead to the same update form:
$$
\epsilon = f(P) - y \\
\Delta = -J^{-1}\epsilon \\
J = 2P \\
P = P + \Delta
$$
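Substituting J = 2P and expanding the step shows that this update is exactly the classic Newton (Babylonian) square-root iteration, which is what the code below computes:

$$
P \leftarrow P - \frac{P^2 - y}{2P} = \frac{1}{2}\left(P + \frac{y}{P}\right)
$$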
epsilon = 1e-9

def sqrt_Gradient(y):
    if y < 0:
        raise ValueError("input is not non-negative")
    if y == 0.0 or y == 1.0:
        return y
    x = y / 2  # initial estimate of P
    # delta = -inv(J) * epsilon, with J = 2x and epsilon = x**2 - y
    delta = (y - x ** 2) / (2 * x)
    while abs(y - x ** 2) > epsilon:
        x += delta
        delta = -(x ** 2 - y) / (2 * x)
    return x
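As a quick sanity check, both routines can be compared against math.sqrt; the sample values below are chosen arbitrarily for illustration:

import math

for y in [0.0, 0.01, 0.25, 1.0, 2.0, 9.0, 100.0]:
    b = sqrt_Bisection(y)
    g = sqrt_Gradient(y)
    print(f"y={y:<8g} bisection={b:.9f} gradient={g:.9f} math.sqrt={math.sqrt(y):.9f}")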