Thu.1 09:00–10:15 | H 1029 | GLO

Global Optimization – Contributed Session 2/2

Chair: Ozan Evkaya
9:00

Majid Darehmiraki

joint work with Seyedeh Masoumeh Hosseini Nejad

The proximal gradient method for the subproblem of the trust region method

Trust region algorithms are a class of numerical methods for nonlinear optimization problems that have been studied extensively for many decades. Solving the subproblem of the trust region method is a fundamental task in nonlinear optimization. In this paper, using Tikhonov regularization, we first turn the constrained quadratic problem into an unconstrained problem and then solve it with the proximal gradient (PG) method. The proposed method is effective for both dense and large sparse problems, including the so-called hard case.
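A minimal sketch of this idea, under assumptions of our own (the regularization weight, the step size, and the splitting below are illustrative choices, not the authors' actual algorithm): the trust-region subproblem min 0.5*x'Bx + g'x subject to ||x|| <= delta is replaced by its Tikhonov-regularized unconstrained counterpart, which a proximal gradient iteration can handle by alternating a gradient step on the quadratic model with the closed-form prox of the regularization term.

```python
import numpy as np

def prox_grad_trs(B, g, lam=1.0, step=None, max_iter=1000, tol=1e-8):
    """Minimal sketch (one possible reading, not the paper's exact method):
    minimize the Tikhonov-regularized model
        q(x) = 0.5*x'Bx + g'x + 0.5*lam*||x||^2
    by proximal gradient, taking the quadratic model as the smooth part and
    handling the Tikhonov term through its closed-form prox (a shrinkage).
    `lam` and `step` are assumed, illustrative values."""
    x = np.zeros(g.size)
    if step is None:
        step = 1.0 / (np.linalg.norm(B, 2) + 1e-12)  # 1/L for the smooth part
    for _ in range(max_iter):
        y = x - step * (B @ x + g)        # gradient step on 0.5*x'Bx + g'x
        x_new = y / (1.0 + step * lam)    # prox of 0.5*lam*||x||^2
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

# toy usage: an indefinite Hessian; lam is chosen large enough that
# B + lam*I is positive definite, which is the delicate point in the hard case
B = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
print(prox_grad_trs(B, g, lam=2.0))
```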

9:25

José Fernández

joint work with Boglárka G.-Tóth, Laura Anton-Sanchez, Juana L. Redondo, Pilar M. Ortigosa

Firm expansion: how to invest? A MINLP model and B&B and heuristic procedures to deal with it

A chain wants to expand its presence in a given region, where it already owns some existing facilities. The chain can open a new facility and/or modify the qualities of some of its existing facilities and/or close some of them. In order to decide the location and quality of the new facility (in case it is opened) as well as the new qualities for the existing facilities, a MINLP is formulated whose objective is to maximize the profit obtained by the chain. Both an exact interval Branch-and-Bound method and a heuristic evolutionary algorithm are proposed to solve this problem.
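The abstract does not give the model's details, so the following is only a hypothetical toy illustration of the structure it describes: the decision variables are the location and quality of one candidate new facility plus the adjustable qualities of two existing facilities, the profit is a made-up Huff-style captured-demand term minus quality costs, and scipy's differential evolution stands in for the authors' evolutionary heuristic (the binary open/close decisions of the actual MINLP are omitted here).

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical toy instance (not the authors' model): demand points, one
# competing facility, and two existing chain facilities with adjustable quality.
demand_pts = np.array([[0.0, 0.0], [4.0, 1.0], [2.0, 5.0]])
demand = np.array([10.0, 6.0, 8.0])
competitor, comp_quality = np.array([3.0, 3.0]), 1.5
existing = np.array([[1.0, 1.0], [5.0, 4.0]])
cost_per_quality = 0.8                 # assumed operating cost per quality unit

def attraction(quality, fac, pts):
    """Huff-style attraction: quality over squared distance (assumed form)."""
    d2 = np.sum((pts - fac) ** 2, axis=1) + 1e-6
    return quality / d2

def neg_profit(v):
    x, y, q_new, q1, q2 = v
    own = (attraction(q_new, np.array([x, y]), demand_pts)
           + attraction(q1, existing[0], demand_pts)
           + attraction(q2, existing[1], demand_pts))
    comp = attraction(comp_quality, competitor, demand_pts)
    revenue = np.sum(demand * own / (own + comp))   # captured market share
    cost = cost_per_quality * (q_new + q1 + q2)
    return -(revenue - cost)                        # minimize negative profit

# evolutionary heuristic stand-in: bounds on location and the three qualities
bounds = [(0, 5), (0, 5), (0.5, 5), (0.5, 5), (0.5, 5)]
res = differential_evolution(neg_profit, bounds, seed=0)
print(res.x, -res.fun)
```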

9:50

Ozan Evkaya

Parameter estimations of finite mixture models with particular optimization tools

Copulas can be easily incorporated into finite mixtures, but owing to the structure of mixture models, gradient-based optimization tools are not practical for parameter estimation. This paper mainly investigates the performance of various optimization tools with different properties for finding the parameters of finite mixture models. In that respect, traditional gradient-based methods as well as recently developed heuristic and global optimization algorithms are studied. To assess performance, the considered problem is solved with the specific optimization routines and the results are compared in terms of accuracy and run time.
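As a rough, self-contained stand-in for the kind of comparison described (a plain two-component Gaussian mixture rather than a copula-based one, and scipy routines chosen only as examples of the gradient-based, derivative-free, and global-heuristic families), one can time the different tools on the same negative log-likelihood and compare the objective values they reach.

```python
import time
import numpy as np
from scipy.optimize import minimize, differential_evolution
from scipy.stats import norm

# Toy stand-in for the paper's experiments: two-component Gaussian mixture.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 0.7, 400), rng.normal(3, 1.2, 600)])

def neg_loglik(theta):
    w, m1, s1, m2, s2 = theta
    # crude guards so the derivative-free search stays in a valid region
    w = np.clip(w, 1e-3, 1 - 1e-3)
    s1, s2 = abs(s1) + 1e-6, abs(s2) + 1e-6
    pdf = w * norm.pdf(data, m1, s1) + (1 - w) * norm.pdf(data, m2, s2)
    return -np.sum(np.log(pdf + 1e-300))

bounds = [(0.01, 0.99), (-5, 5), (0.1, 3), (-5, 5), (0.1, 3)]
x0 = np.array([0.5, -1.0, 1.0, 1.0, 1.0])

for name, run in [
    ("L-BFGS-B (gradient-based)",
     lambda: minimize(neg_loglik, x0, method="L-BFGS-B", bounds=bounds)),
    ("Nelder-Mead (derivative-free)",
     lambda: minimize(neg_loglik, x0, method="Nelder-Mead")),
    ("differential evolution (global heuristic)",
     lambda: differential_evolution(neg_loglik, bounds, seed=1)),
]:
    t0 = time.perf_counter()
    res = run()
    print(f"{name}: nll={res.fun:.1f}, time={time.perf_counter() - t0:.2f}s")
```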