Duanshuiliu2020/Optimization_Algorithm

Note: this repository declares no open-source license file (LICENSE); check the project description and its upstream code dependencies before use.
python_Newton.py (1.13 KB)
GaoBoYu599 committed on 2018-12-31 10:59: Add files via upload
import sympy
import numpy as np

x, y = sympy.symbols('x y')
f = (x + y)**2 + (x + 1)**2 + (y + 3)**2
# First-order partial derivatives:
fx = sympy.diff(f, x)
fy = sympy.diff(f, y)
# Second-order partial derivatives:
fxx = sympy.diff(fx, x)
fyy = sympy.diff(fy, y)
fxy = sympy.diff(fx, y)
fyx = sympy.diff(fy, x)
# f is quadratic, so the Hessian is constant and can be built once:
grad_H2 = np.array([[float(fxx), float(fxy)],
                    [float(fyx), float(fyy)]])
# Parameter settings:
acc = 0.001      # convergence tolerance on the step length
x_tmp = 10       # initial point
y_tmp = -1.5
k = 0            # iteration counter
print('Newton descent start:\n')
while 1:
    # Evaluate the gradient at the current point:
    grad_f1 = np.array([[float(fx.evalf(subs={x: x_tmp, y: y_tmp}))],
                        [float(fy.evalf(subs={x: x_tmp, y: y_tmp}))]])
    # Newton step: x_{k+1} = x_k - H^{-1} * grad f(x_k)
    ans_tmp = np.array([[x_tmp], [y_tmp]]) - np.dot(np.linalg.inv(grad_H2), grad_f1)
    acc_tmp = ((ans_tmp[0, 0] - x_tmp)**2 + (ans_tmp[1, 0] - y_tmp)**2)**0.5
    x_tmp = ans_tmp[0, 0]
    y_tmp = ans_tmp[1, 0]
    # Compute f at the new point before the convergence check,
    # so f_tmp is always defined when it is printed:
    f_tmp = (x_tmp + y_tmp)**2 + (x_tmp + 1)**2 + (y_tmp + 3)**2
    k = k + 1
    if acc_tmp <= acc:
        print('Extremum at: (%.5f, %.5f, %.5f)' % (x_tmp, y_tmp, f_tmp))
        print('Iterations: %d' % k)
        break
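Because f is a quadratic with a constant positive-definite Hessian, a single Newton step lands exactly on the minimizer, so the loop should terminate after one iteration. As an independent cross-check (a minimal sketch, not part of the original file), the stationary point can be obtained symbolically by solving grad f = 0 with `sympy.solve`:

```python
import sympy

x, y = sympy.symbols('x y')
f = (x + y)**2 + (x + 1)**2 + (y + 3)**2

# Solve the first-order conditions fx = 0, fy = 0 directly.
sol = sympy.solve([sympy.diff(f, x), sympy.diff(f, y)], [x, y])
print(sol)          # {x: 1/3, y: -5/3}
print(f.subs(sol))  # minimum value: 16/3
```

The symbolic result (1/3, -5/3) with f = 16/3 is what the Newton loop above converges to, up to the printing precision.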