Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization

Yasushi Narushima, Hiroshi Yabe

Research output: Contribution to journal › Article

30 Citations (Scopus)

Abstract

Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. To incorporate second-order information about the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed a conjugate gradient method that always generates a descent search direction. In this paper, combining the ideas of Dai and Liao with those of Hager and Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. We also prove global convergence properties of the proposed methods and report preliminary numerical results.
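
To make the two ingredients named in the abstract concrete: Dai and Liao (2001) choose the conjugate gradient parameter from secant information, beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k) with s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k and t >= 0, while Hager and Zhang (2005) bound their parameter from below so that d_{k+1} = -g_{k+1} + beta_k d_k remains a descent direction. The Python sketch below combines a Dai-Liao-type beta with a Hager-Zhang-style truncation purely for illustration; the function names and parameter values here are our own, this is not the exact family of methods proposed in the paper, and simple Armijo backtracking stands in for the Wolfe-type line search the convergence analysis would require.

import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking; a stand-in for the Wolfe conditions assumed
    in the convergence theory of descent CG methods (illustrative only)."""
    fx, gd = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c * alpha * gd:
        alpha *= rho
    return alpha

def cg_sketch(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Nonlinear CG with a Dai-Liao-type beta built from the secant pair
    (s_k, y_k) and a Hager-Zhang-style lower truncation of beta.
    A hypothetical sketch, not the methods proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g          # secant pair s_k, y_k
        dy = d @ y
        # Dai-Liao-type parameter: (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k)
        beta = (g_new @ y - t * (g_new @ s)) / dy if abs(dy) > 1e-12 else 0.0
        # Hager-Zhang-style truncation from below (0.01 is the CG_DESCENT default)
        beta = max(beta, -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g))))
        d = -g_new + beta * d
        if d @ g_new >= 0:                   # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
sol = cg_sketch(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(2))

The parameter t >= 0 in the Dai-Liao formula weighs how strongly the secant (quasi-Newton) information enters the direction; t = 0 recovers the Hestenes-Stiefel parameter, which is why such methods are viewed as carrying second-order information.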

Original language: English
Pages (from-to): 4303-4317
Number of pages: 15
Journal: Journal of Computational and Applied Mathematics
Volume: 236
Issue number: 17
DOI: 10.1016/j.cam.2012.01.036
Publication status: Published - 2012 Nov 1
Externally published: Yes

Keywords

  • Conjugate gradient method
  • Descent search direction
  • Global convergence
  • Secant condition
  • Unconstrained optimization

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics

Cite this

@article{65bb5abc2c9e472a81a485bd018fab03,
title = "Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization",
abstract = "Conjugate gradient methods have been paid attention to, because they can be directly applied to large-scale unconstrained optimization problems. In order to incorporate second order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method which always generates a descent search direction. In this paper, combining Dai-Liao's idea and Hager-Zhang's idea, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.",
keywords = "Conjugate gradient method, Descent search direction, Global convergence, Secant condition, Unconstrained optimization",
author = "Yasushi Narushima and Hiroshi Yabe",
year = "2012",
month = "11",
day = "1",
doi = "10.1016/j.cam.2012.01.036",
language = "English",
volume = "236",
pages = "4303--4317",
journal = "Journal of Computational and Applied Mathematics",
issn = "0377-0427",
publisher = "Elsevier",
number = "17",

}
