Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems

Michiya Kobayashi, Yasushi Narushima, Hiroshi Yabe

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

In this paper, we deal with conjugate gradient methods for solving nonlinear least squares problems. Several Newton-like methods have been studied for solving nonlinear least squares problems, which include the Gauss-Newton method, the Levenberg-Marquardt method and the structured quasi-Newton methods. On the other hand, conjugate gradient methods are appealing for general large-scale nonlinear optimization problems. By combining the structured secant condition and the idea of Dai and Liao (2001) [20], the present paper proposes conjugate gradient methods that make use of the structure of the Hessian of the objective function of nonlinear least squares problems. The proposed methods are shown to be globally convergent under some assumptions. Finally, some numerical results are given.
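As an illustration of the Dai–Liao-type conjugate gradient framework the abstract refers to, the sketch below implements the classical Dai–Liao direction update, beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k), with an Armijo backtracking line search, applied to a small two-residual least squares problem. The test problem, the parameter t = 0.1, and the restart safeguard are illustrative choices only; they are not the paper's structured-secant variant, which modifies y_k using the J^T J part of the least squares Hessian.

```python
import numpy as np

def residual(x):
    # Hypothetical two-residual test problem with minimizer (1, 1):
    # r1 = x0 - 1, r2 = x1 - x0^2
    return np.array([x[0] - 1.0, x[1] - x[0] ** 2])

def jacobian(x):
    return np.array([[1.0, 0.0], [-2.0 * x[0], 1.0]])

def grad(x):
    # gradient of f(x) = 0.5 * ||r(x)||^2 is J(x)^T r(x)
    return jacobian(x).T @ residual(x)

def fval(x):
    r = residual(x)
    return 0.5 * r @ r

def dai_liao_cg(x, t=0.1, tol=1e-8, max_iter=2000):
    """Nonlinear CG with the Dai-Liao beta and Armijo backtracking."""
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, f0, gd = 1.0, fval(x), g @ d
        while fval(x + alpha * d) > f0 + 1e-4 * alpha * gd:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Dai-Liao conjugacy parameter: beta = g_new^T (y - t*s) / (d^T y)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0.0:
            # safeguard: restart with steepest descent if d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

In the structured setting studied by the paper, the vector y above would be replaced by a structured secant analogue so that the search direction reflects the J^T J + S splitting of the Hessian of 0.5*||r(x)||^2.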

Original language: English
Pages (from-to): 375-397
Number of pages: 23
Journal: Journal of Computational and Applied Mathematics
Volume: 234
Issue number: 2
DOI: 10.1016/j.cam.2009.12.031
Publication status: Published - 2010 May 15
Externally published: Yes

Keywords

  • Conjugate gradient method
  • Global convergence
  • Least squares problems
  • Line search
  • Structured secant condition

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics

Cite this

Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems. / Kobayashi, Michiya; Narushima, Yasushi; Yabe, Hiroshi.

In: Journal of Computational and Applied Mathematics, Vol. 234, No. 2, 15.05.2010, p. 375-397.

Research output: Contribution to journal › Article

@article{9601c8415e33477dbf12c63d4aac8e17,
title = "Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems",
abstract = "In this paper, we deal with conjugate gradient methods for solving nonlinear least squares problems. Several Newton-like methods have been studied for solving nonlinear least squares problems, which include the Gauss-Newton method, the Levenberg-Marquardt method and the structured quasi-Newton methods. On the other hand, conjugate gradient methods are appealing for general large-scale nonlinear optimization problems. By combining the structured secant condition and the idea of Dai and Liao (2001) [20], the present paper proposes conjugate gradient methods that make use of the structure of the Hessian of the objective function of nonlinear least squares problems. The proposed methods are shown to be globally convergent under some assumptions. Finally, some numerical results are given.",
keywords = "Conjugate gradient method, Global convergence, Least squares problems, Line search, Structured secant condition",
author = "Michiya Kobayashi and Yasushi Narushima and Hiroshi Yabe",
year = "2010",
month = "5",
day = "15",
doi = "10.1016/j.cam.2009.12.031",
language = "English",
volume = "234",
pages = "375--397",
journal = "Journal of Computational and Applied Mathematics",
issn = "0377-0427",
publisher = "Elsevier",
number = "2",
}
