 
Volume 35 • Number 2 • 2012
 
• Scaling on Diagonal Quasi-Newton Update for Large-Scale Unconstrained Optimization
Wah June Leong, Mahboubeh Farid and Malik Abu Hassan

Abstract.
Diagonal quasi-Newton (DQN) methods are a class of quasi-Newton methods that replace the standard quasi-Newton updates of approximations to the Hessian or its inverse with diagonal updating matrices. Most often, the updating formulae for this class of methods are derived via a variational approach. A major drawback of this approach is that the derived diagonal matrix may lose positive definiteness and thus may be unsuitable for use within a descent-gradient algorithm. Previous strategies for overcoming this difficulty concentrate on skipping or restarting the non-descent steps. Doing so discards the second-derivative information gathered on the previous step, and consequently convergence is usually slower than it would be without these remedies. Hence the present paper proposes a simple yet effective remedy for the non-positive-definite updating matrices that arise in variational-based DQN methods. To this end, we find that incorporating an appropriate scaling into the diagonal update improves step-wise convergence while preserving positive definiteness of the updates. Finally, the new DQN method is tested for computational efficiency and stability on numerous test functions, and the numerical results indicate clear superiority over existing methods.
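To make the setting concrete, the sketch below shows a generic descent method with a diagonally stored Hessian approximation, updated by the standard variational (weak-secant) diagonal formula and safeguarded by a scaling factor gamma that keeps every diagonal entry positive. The particular choice gamma = min(1, s'y / s'Bs), the Armijo line search, and all names in the code are illustrative assumptions, not the scaled update actually derived in the paper.

    import numpy as np

    def scaled_diagonal_qn(f, grad, x0, tol=1e-6, max_iter=500):
        """Descent method with a diagonal quasi-Newton Hessian approximation.

        Illustrative sketch only: the diagonal update is the standard
        variational (weak-secant) formula, and the scaling gamma is one
        generic choice that preserves positivity; it is not claimed to be
        the paper's exact scaling.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        B = np.ones_like(x)                      # diagonal of the Hessian approximation

        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            d = -g / B                           # search direction d = -B^{-1} g (B diagonal)

            # simple backtracking (Armijo) line search
            alpha, c1, fx = 1.0, 1e-4, f(x)
            while f(x + alpha * d) > fx + c1 * alpha * np.dot(g, d) and alpha > 1e-12:
                alpha *= 0.5

            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy, sBs = np.dot(s, y), np.dot(B, s * s)
            trE2 = np.dot(s * s, s * s)          # trace of E^2 with E = diag(s_i^2)

            if sy > 1e-12 and sBs > 0.0 and trE2 > 0.0:
                # Any gamma in (0, s'y / s'Bs] makes the correction numerator
                # nonnegative, so every updated diagonal entry stays positive.
                gamma = min(1.0, sy / sBs)
                B = gamma * B + ((sy - gamma * sBs) / trE2) * s * s

            x, g = x_new, g_new

        return x

    # illustrative use on a separable convex quadratic f(x) = sum_i w_i x_i^2
    w = np.arange(1.0, 11.0)
    x_min = scaled_diagonal_qn(lambda x: np.dot(w, x * x),
                               lambda x: 2.0 * w * x,
                               x0=np.ones(10))

Because the correction term is a nonnegative multiple of s_i^2 whenever gamma does not exceed s'y / s'Bs, no skipping or restarting of steps is needed in this sketch; that is the positivity-preserving role the scaling plays here.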

2010 Mathematics Subject Classification: Primary: 65L05; Secondary: 65F10.


Full text: PDF
 