Abstract
Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions of the conventional methods are defined using only the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods that primarily take into account information about the objective function itself. We prove that they converge globally and compare them numerically with conventional methods. The results show that, with a slight modification to the search direction, one of our methods performs as well as the best conventional method, which employs the Hestenes-Stiefel formula.
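For reference, the following is a minimal sketch of the kind of conventional nonlinear conjugate gradient iteration the abstract compares against, using the Hestenes-Stiefel formula. The function name `hs_conjugate_gradient`, the Armijo backtracking parameters, and the steepest-descent restart safeguard are illustrative assumptions; this is not the paper's proposed method, which further modifies the search direction using objective function information.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the Hestenes-Stiefel beta and Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard (assumption): restart if d is not a descent direction
        # Armijo backtracking line search for the step size alpha
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference y_k = g_{k+1} - g_k
        # Hestenes-Stiefel formula: beta_k = g_{k+1}^T y_k / (d_k^T y_k)
        beta = g_new.dot(y) / d.dot(y)
        d = -g_new + beta * d  # next search direction
        x, g = x_new, g_new
    return x

# Usage example: minimize the two-dimensional Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(hs_conjugate_gradient(f, grad, np.array([-1.2, 1.0])))  # approx. [1, 1]
```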
Original language | English |
---|---|
Pages (from-to) | 941-955 |
Number of pages | 15 |
Journal | Optimization Letters |
Volume | 6 |
Issue | 5 |
DOI | |
Publication status | Published - June 2012 |
Externally published | Yes |
ASJC Scopus subject areas
- Control and Optimization