
Iterative Grossone-Based Computation of Negative Curvature Directions in Large-Scale Optimization

Fasano, Giovanni
2020-01-01

Abstract

We consider the iterative computation of negative curvature directions in large-scale unconstrained optimization frameworks, where such directions are needed to ensure convergence toward stationary points that satisfy second-order necessary optimality conditions. We show that, to this purpose, we can fruitfully couple the conjugate gradient (CG) method with a recently introduced approach based on the numeral grossone. In particular, recalling that in principle the CG method is well posed only when solving positive definite linear systems, our proposal exploits grossone to enhance the CG method, allowing the computation of negative curvature directions in the indefinite case as well. The overall method can be used to significantly generalize the theory in the state-of-the-art literature. Moreover, it straightforwardly allows the solution of Newton's equation in optimization frameworks, including nonconvex problems. We remark that our iterative procedure for computing a negative curvature direction does not require the storage of any matrix; it only needs to store a couple of vectors. This represents a definite advance with respect to current results in the literature.
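The abstract describes coupling the CG method with grossone so that a negative curvature direction can be extracted when the underlying linear system is indefinite, while storing only a few vectors. As a point of reference only (this is not the authors' grossone-based procedure), the sketch below shows a standard matrix-free CG loop in Python that stops and returns the current search direction as soon as it detects nonpositive curvature; the function name cg_with_negative_curvature and its parameters are illustrative assumptions.

```python
import numpy as np

def cg_with_negative_curvature(matvec, b, tol=1e-8, max_iter=200):
    """Plain conjugate gradient on A x = b that exits when the current
    search direction p has nonpositive curvature (p^T A p <= 0).

    Illustrative baseline only: it returns that direction and stops,
    whereas the paper's grossone-based CG can keep iterating on
    indefinite systems. `matvec` computes A @ v without storing A,
    so only a handful of vectors are kept in memory.
    """
    x = np.zeros_like(b)
    r = b.copy()          # residual b - A x (with x = 0 initially)
    p = r.copy()          # current search direction
    for _ in range(max_iter):
        Ap = matvec(p)
        curv = p @ Ap     # curvature of the quadratic model along p
        if curv <= 0:
            return x, p, "negative_curvature"
        alpha = (r @ r) / curv
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) <= tol:
            return x, None, "converged"
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x, None, "max_iter"

# Example: an indefinite 2x2 system triggers the negative-curvature exit.
A = np.array([[1.0, 0.0], [0.0, -2.0]])
b = np.array([1.0, 1.0])
x, d, status = cg_with_negative_curvature(lambda v: A @ v, b)
print(status, d)   # d satisfies d^T A d < 0
```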
Files in this item:
File: Leone_et_al-2020-Journal_of_Optimization_Theory_and_Applications.pdf
Access: open access
Description: Published article
Type: Publisher's version
License: Creative Commons
Size: 816.9 kB
Format: Adobe PDF

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/3728921
Citations
  • PMC: not available
  • Scopus: 11
  • Web of Science: 10