Multi-step nonlinear conjugate gradient methods for unconstrained minimization

John A. Ford, Yasushi Narushima, Hiroshi Yabe

Research output: Contribution to journal › Article

28 Citations (Scopus)

Abstract

Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid the storage of matrices. Recently, seeking fast convergence of these methods, Dai and Liao (Appl. Math. Optim. 43:87-101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput. Optim. Appl. 28:203-225, 2004) proposed another conjugate gradient method based on the modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim. Methods Softw. 2:357-370, 1993; J. Comput. Appl. Math. 50:305-323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
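
For context, the conditions referred to in the abstract can be summarized schematically. The notation below is assumed, following standard usage in this literature rather than a verbatim extract from the paper: x_k are the iterates, g_k the gradient at x_k, d_k the search direction, B_{k+1} a Hessian approximation, and t > 0 and delta_k scalar parameters. The two-step form of the multi-step condition is shown for illustration only; the paper builds the quantities r_k and w_k by interpolation over several previous steps.

% Schematic summary (assumed standard notation, not verbatim from the paper)
\begin{align*}
  & x_{k+1} = x_k + \alpha_k d_k, \qquad
    d_{k+1} = -g_{k+1} + \beta_k d_k
    && \text{(conjugate gradient iteration)} \\
  & B_{k+1} s_k = y_k, \qquad
    s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k
    && \text{(secant condition)} \\
  & \beta_k^{\mathrm{DL}}
      = \frac{g_{k+1}^{\top}(y_k - t\, s_k)}{d_k^{\top} y_k}
    && \text{(Dai--Liao parameter derived from it)} \\
  & B_{k+1} r_k = w_k, \qquad
    r_k = s_k - \delta_k s_{k-1}, \quad
    w_k = y_k - \delta_k y_{k-1}
    && \text{(two-step secant condition, schematic)}
\end{align*}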

Original language: English
Pages (from-to): 191-216
Number of pages: 26
Journal: Computational Optimization and Applications
Volume: 40
Issue number: 2
DOI: 10.1007/s10589-007-9087-z
ISSN: 0926-6003
Publisher: Springer Netherlands
Publication status: Published - 2008 Jun 1
Externally published: Yes

Keywords

  • Conjugate gradient method
  • Global convergence
  • Line search
  • Multi-step secant condition
  • Unconstrained optimization

ASJC Scopus subject areas

  • Control and Optimization
  • Computational Mathematics
  • Applied Mathematics

Cite this

Multi-step nonlinear conjugate gradient methods for unconstrained minimization. / Ford, John A.; Narushima, Yasushi; Yabe, Hiroshi.

In: Computational Optimization and Applications, Vol. 40, No. 2, 01.06.2008, p. 191-216.
