A Class of Hybrid Conjugate Gradient Methods
【Abstract】 We improve the HS-DY hybrid conjugate gradient method proposed by Dai Zhifeng and Chen Lanping, allowing the parameter β_k to be chosen from a wider range than in the Dai-Yuan method. Based on the same idea, we also propose a hybrid conjugate gradient method that combines the DY and PRP algorithms and inherits the advantages of both. Global convergence of both methods is proved under the Wolfe line search without imposing a descent condition. Numerical experiments, including comparisons with the HS and PRP conjugate gradient methods, show that the algorithms are efficient.
【Key words】 unconstrained optimization; conjugate gradient method; Wolfe line search; global convergence
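The abstract does not reproduce the paper's formulas, but the β_k choices it refers to are standard in the conjugate gradient literature: β_k^HS = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1}), β_k^DY = ||g_k||^2 / (d_{k-1}^T y_{k-1}), and β_k^PRP = g_k^T y_{k-1} / ||g_{k-1}||^2, where g_k is the gradient and y_{k-1} = g_k - g_{k-1}. The Python sketch below illustrates a generic HS-DY hybrid iteration under a Wolfe line search using the classical hybrid rule β_k = max(0, min(β_k^HS, β_k^DY)); it is not the paper's method (the widened parameter range and the DY-PRP variant are not reproduced), and all function and parameter names are illustrative.

```python
# Minimal sketch of a generic HS-DY hybrid conjugate gradient method under a
# Wolfe line search. Assumptions: the classical hybrid rule
#   beta_k = max(0, min(beta_HS, beta_DY))
# is used instead of the paper's widened rule.
import numpy as np
from scipy.optimize import line_search  # enforces (strong) Wolfe conditions


def hybrid_hs_dy_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d)[0]   # Wolfe step length along d
        if alpha is None:                       # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y                      # shared denominator of beta_HS, beta_DY
        if abs(denom) < 1e-16:             # guard against division by zero
            beta = 0.0
        else:
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))   # classical HS-DY hybrid choice
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


# Usage example on a simple ill-conditioned quadratic
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(hybrid_hs_dy_cg(f, grad, np.ones(3)))
```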
- 【Source】 首都师范大学学报(自然科学版) / Journal of Capital Normal University (Natural Science Edition), 2007, No. 2
- 【CLC Number】 O221.1
- 【Cited By】 9
- 【Downloads】 170