Conjugate gradient methods using value of objective function for unconstrained optimization

Hideaki Iiduka, Yasushi Narushima

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions for the conventional methods are defined by using the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which take into account mostly information about the objective function. We prove that they converge globally and numerically compare them with conventional methods. The results show that with slight modification to the direction, one of our methods performs as well as the best conventional method employing the Hestenes-Stiefel formula.
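
For context, the conventional search direction the abstract contrasts against is d_{k+1} = -g_{k+1} + beta_k d_k, where g_k is the gradient at the k-th iterate; the Hestenes-Stiefel formula mentioned in the results sets beta_k = (g_{k+1}^T y_k) / (d_k^T y_k) with y_k = g_{k+1} - g_k, and the step size is chosen to satisfy the Wolfe conditions (see Keywords). Below is a minimal sketch of such a conventional Hestenes-Stiefel method in Python. It is not the authors' proposed methods (those are defined in the paper itself); the Rosenbrock test function, the line-search constants c1 and c2, and the descent-restart safeguard are illustrative assumptions.

import numpy as np

def rosenbrock(x):
    # Classic nonconvex test function; an illustrative choice, not from the paper.
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    # Bisection/doubling search for a step satisfying the standard Wolfe
    # conditions: sufficient decrease (Armijo) plus the curvature condition.
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, slope0 = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:
            hi = alpha                      # Armijo fails: step too long
        elif grad(x + alpha * d) @ d < c2 * slope0:
            lo = alpha                      # curvature fails: step too short
        else:
            return alpha                    # both Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def cg_hestenes_stiefel(f, grad, x0, tol=1e-6, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g                       # gradient difference y_k
        beta = (g_new @ y) / (d @ y)        # Hestenes-Stiefel formula
        d = -g_new + beta * d
        if g_new @ d >= 0.0:                # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
    return x

print(cg_hestenes_stiefel(rosenbrock, rosenbrock_grad, np.full(10, -1.0)))

Under the Wolfe conditions, d_k^T y_k > 0 holds whenever d_k is a descent direction, which keeps the Hestenes-Stiefel denominator positive. The paper's proposed methods modify this gradient-based construction so that values of the objective function also enter the direction; their exact form is given in the paper.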

Original language: English
Pages (from-to): 941-955
Number of pages: 15
Journal: Optimization Letters
Volume: 6
Issue number: 5
DOIs: 10.1007/s11590-011-0324-0
Publication status: Published - 2012 Jan 1
Externally published: Yes

Keywords

  • Conjugate gradient method
  • Global convergence
  • Unconstrained optimization problem
  • Wolfe conditions

ASJC Scopus subject areas

  • Control and Optimization

Cite this

Conjugate gradient methods using value of objective function for unconstrained optimization. / Iiduka, Hideaki; Narushima, Yasushi.

In: Optimization Letters, Vol. 6, No. 5, 01.01.2012, p. 941-955.

Research output: Contribution to journal › Article

@article{28ca710bb8544ea6a682080053f090b3,
title = "Conjugate gradient methods using value of objective function for unconstrained optimization",
abstract = "Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions for the conventional methods are defined by using the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which take into account mostly information about the objective function. We prove that they converge globally and numerically compare them with conventional methods. The results show that with slight modification to the direction, one of our methods performs as well as the best conventional method employing the Hestenes-Stiefel formula.",
keywords = "Conjugate gradient method, Global convergence, Unconstrained optimization problem, Wolfe conditions",
author = "Hideaki Iiduka and Yasushi Narushima",
year = "2012",
month = "1",
day = "1",
doi = "10.1007/s11590-011-0324-0",
language = "English",
volume = "6",
pages = "941--955",
journal = "Optimization Letters",
issn = "1862-4472",
publisher = "Springer Verlag",
number = "5",

}

TY - JOUR

T1 - Conjugate gradient methods using value of objective function for unconstrained optimization

AU - Iiduka, Hideaki

AU - Narushima, Yasushi

PY - 2012/1/1

Y1 - 2012/1/1

N2 - Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions for the conventional methods are defined by using the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which take into account mostly information about the objective function. We prove that they converge globally and numerically compare them with conventional methods. The results show that with slight modification to the direction, one of our methods performs as well as the best conventional method employing the Hestenes-Stiefel formula.

AB - Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions for the conventional methods are defined by using the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which take into account mostly information about the objective function. We prove that they converge globally and numerically compare them with conventional methods. The results show that with slight modification to the direction, one of our methods performs as well as the best conventional method employing the Hestenes-Stiefel formula.

KW - Conjugate gradient method

KW - Global convergence

KW - Unconstrained optimization problem

KW - Wolfe conditions

UR - http://www.scopus.com/inward/record.url?scp=85027927880&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85027927880&partnerID=8YFLogxK

U2 - 10.1007/s11590-011-0324-0

DO - 10.1007/s11590-011-0324-0

M3 - Article

AN - SCOPUS:85027927880

VL - 6

SP - 941

EP - 955

JO - Optimization Letters

JF - Optimization Letters

SN - 1862-4472

IS - 5

ER -