Quasi–Newton updates for Preconditioned Nonlinear Conjugate Gradient methods
FASANO, Giovanni;
2012-01-01
Abstract
In this paper we study new preconditioners for the Nonlinear Conjugate Gradient (NCG) method in large scale unconstrained optimization. Our preconditioners are based on quasi-Newton updates, which approximate the inverse of the Hessian matrix. In particular, we consider a couple of new low-rank quasi-Newton symmetric updating formulae. Preliminary numerical experiments are carried out, comparing one of our proposals with the use of the L-BFGS update as a preconditioner.
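The abstract compares the new low-rank preconditioners against the L-BFGS update used as a preconditioner. As a rough illustration of that baseline only (not the authors' proposed formulae, which are not detailed here), the Python sketch below applies an L-BFGS-style two-loop recursion to the gradient inside a Polak-Ribière NCG iteration. The function names, the memory size, the Armijo backtracking line search and the restart safeguard are all illustrative assumptions.

```python
import numpy as np


def lbfgs_apply(memory, g):
    """Apply the implicit inverse-Hessian approximation built from the
    stored (s, y, rho) pairs to the vector g (standard two-loop recursion)."""
    q = g.copy()
    alphas = []
    for s, y, rho in reversed(memory):          # newest pair first
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    if memory:                                   # usual gamma * I initial scaling
        s, y, _ = memory[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y, rho), a in zip(memory, reversed(alphas)):   # oldest pair first
        b = rho * y.dot(q)
        q += (a - b) * s
    return q


def precond_ncg(f, grad, x0, m=5, tol=1e-6, max_iter=1000):
    """Polak-Ribiere(+) NCG preconditioned by an L-BFGS-style update (sketch)."""
    x = x0.astype(float).copy()
    g = grad(x)
    memory = []                                  # at most m recent (s, y, rho) pairs
    z = lbfgs_apply(memory, g)                   # preconditioned gradient
    d = -z
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, fx, slope = 1.0, f(x), g.dot(d)   # Armijo backtracking line search
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s.dot(y) > 1e-10:                     # keep only curvature-positive pairs
            memory.append((s, y, 1.0 / s.dot(y)))
            if len(memory) > m:
                memory.pop(0)
        z_new = lbfgs_apply(memory, g_new)
        beta = max(0.0, z_new.dot(g_new - g) / z.dot(g))   # preconditioned PR+
        d = -z_new + beta * d
        if g_new.dot(d) >= 0.0:                  # safeguard: restart on non-descent
            d = -z_new
        x, g, z = x_new, g_new, z_new
    return x


if __name__ == "__main__":
    # Quick check on an ill-conditioned convex quadratic f(x) = 0.5 * x^T diag(a) x.
    a = np.linspace(1.0, 1e3, 100)
    x = precond_ncg(lambda v: 0.5 * (a * v * v).sum(), lambda v: a * v, np.ones(100))
    print(np.linalg.norm(a * x))                 # gradient norm at the returned point
```

The preconditioner is never formed as a matrix: the stored (s, y) pairs define an implicit approximation of the inverse Hessian applied to the gradient at each iteration, which is the role the abstract assigns to its quasi-Newton preconditioners.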
Files in this record:

File | Type | License | Size | Format
---|---|---|---|---
FaRo_NCG.pdf (open access) | Pre-print | Free access (no restrictions) | 189.11 kB | Adobe PDF
Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.