Wed.2 14:15–15:30 | H 1029 | NON

Advances in Nonlinear Optimization Algorithms (1/2)

Chair: Mehiddin Al-Baali
14:15

Johannes Brust

joint work with Roummel Marcia, Cosmin Petra

Large-Scale Quasi-Newton Trust-Region Methods with Low-Dimensional Equality Constraints

For large-scale optimization problems, limited-memory quasi-Newton methods are an efficient means of approximating Hessian matrices. We present two L-BFGS trust-region methods for large optimization problems with a small number of linear equality constraints. Both methods exploit a compact representation of the (1,1) block of the inverse KKT matrix. One of the proposed methods uses an implicit eigendecomposition to compute search directions by an analytic formula, and the other uses a one-dimensional root solver. We compare the methods to alternative quasi-Newton algorithms on a set of CUTEst problems.
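[For context, the compact representation referred to here is commonly taken to be the Byrd–Nocedal–Schnabel form of the L-BFGS matrix; a minimal sketch of that standard form is given below, while the exact block structure exploited in the talk may differ:

B_k = B_0 - \begin{bmatrix} B_0 S_k & Y_k \end{bmatrix} \begin{bmatrix} S_k^\top B_0 S_k & L_k \\ L_k^\top & -D_k \end{bmatrix}^{-1} \begin{bmatrix} S_k^\top B_0 \\ Y_k^\top \end{bmatrix},

where S_k and Y_k collect the stored pairs s_i = x_{i+1} - x_i and y_i = \nabla f(x_{i+1}) - \nabla f(x_i), D_k = \mathrm{diag}(s_i^\top y_i), and L_k is the strictly lower triangular part of S_k^\top Y_k.]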

14:40

Hardik Tankaria

joint work with Nobuo Yamashita

Non-monotone regularized limited memory BFGS method with line search for unconstrained optimization

Recently, Sugimoto and Yamashita proposed the regularized limited memory BFGS method (RL-BFGS) with a non-monotone technique for unconstrained optimization and reported that it is competitive with the standard L-BFGS. However, RL-BFGS does not use a line search and hence cannot take longer steps. To allow longer steps, we propose to perform a line search with the Wolfe conditions when an iteration of RL-BFGS is successful and the search direction is still a descent direction. Numerical results show that RL-BFGS with the proposed technique is more efficient and stable than both the standard L-BFGS and the original RL-BFGS methods.
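[For reference, the Wolfe conditions invoked here are the standard ones: for a step size \alpha along a descent direction d_k they require

f(x_k + \alpha d_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^\top d_k \quad \text{and} \quad \nabla f(x_k + \alpha d_k)^\top d_k \ge c_2 \nabla f(x_k)^\top d_k,

with 0 < c_1 < c_2 < 1; the particular constants used in the proposed method are not specified in the abstract.]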

15:05

Mehiddin Al-Baali

Conjugate Gradient Methods with Quasi-Newton Features for Unconstrained Optimization

The recent class of symmetric conjugate gradient methods for large-scale unconstrained optimization will be considered. A quasi-Newton-like condition will be introduced into this class of methods, in a sense to be defined. It will be shown that the proposed methods converge globally and have some useful features. Numerical results will be presented to illustrate the behavior of some new techniques that substantially improve the performance of several conjugate gradient methods.
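[As background (the precise condition studied in the talk is not stated in the abstract), nonlinear conjugate gradient methods update the search direction as

d_{k+1} = -g_{k+1} + \beta_k d_k,

and one common way to impose a quasi-Newton-like requirement is a Dai–Liao type condition d_{k+1}^\top y_k = -t\, g_{k+1}^\top s_k for some t > 0, where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.]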