Tue.4 16:30–17:45 | H 1012 | GLO

Techniques in Global Optimization

Chair: Jon Lee | Organizers: Jon Lee, Emily Speakman
16:30

Daphne Skipper

joint work with Jon Lee, Emily Speakman

Gaining and losing perspective

We discuss MINLO formulations of the disjunction $x\in\{0\}\cup[l,u]$, where $z$ is a binary indicator of $x\in[l,u]$, and $y$ captures the behavior of $x^p$, for $p>1$. This model is useful when activities have operating ranges and we pay a fixed cost for carrying out each activity. Using volume as a measure to compare convex bodies, we investigate a family of relaxations for this model, employing the inequality $yz^q \geq x^p$, parameterized by the ``lifting exponent'' $q\in[0,p-1]$. We analytically determine the behavior of these relaxations as functions of $l$, $u$, $p$, and $q$.
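
For reference, and assuming the standard perspective construction $x^p/z^{p-1} = z\,(x/z)^p$, the endpoints of this family correspond to two familiar relaxations:
\[
q=0:\ \ y \geq x^p, \qquad\qquad q=p-1:\ \ y\,z^{p-1} \geq x^p,
\]
the former ignoring the indicator entirely and the latter being the perspective relaxation.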

16:55

Claudia D'Ambrosio

joint work with Jon Lee, Daphne Skipper, Dimitri Thomopulos

Handling separable non-convexities with disjunctive cuts

The Sequential Convex Mixed-Integer Nonlinear Programming (SC-MINLP) method is an algorithmic approach for handling separable non-convexities in the context of global optimization. The algorithm is based on a sequence of convex MINLPs that provide lower bounds on the optimal solution value. Solving a convex MINLP at each iteration is the main bottleneck of SC-MINLP. We propose to improve this phase by employing disjunctive cuts and solving, instead, a sequence of convex NLPs. Computational results show the viability of our approach.
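
In outline, the bounding scheme can be sketched as a generic lower/upper-bounding loop; the following Python sketch is schematic only, with the solver callbacks (solve_relaxation for the convex-MINLP lower-bounding step, solve_restriction for finding a feasible upper bound, refine for tightening the relaxation) passed in as hypothetical placeholders rather than taken from the authors' implementation.

def sc_minlp_loop(problem, solve_relaxation, solve_restriction, refine,
                  tol=1e-4, max_iter=50):
    """Generic bounding loop in the spirit of SC-MINLP (sketch only)."""
    lower, upper = float("-inf"), float("inf")
    for _ in range(max_iter):
        # Lower bound: solve a convex relaxation of the separable
        # nonconvexities (a convex MINLP in the original scheme; the
        # proposal replaces this step with disjunctive cuts and convex NLPs).
        relax_value, relax_point = solve_relaxation(problem)
        lower = max(lower, relax_value)
        # Upper bound: look for a feasible point of the original problem
        # near the relaxation solution.
        upper = min(upper, solve_restriction(problem, relax_point))
        if upper - lower <= tol * max(1.0, abs(upper)):
            break
        # Tighten the relaxation (e.g., additional breakpoints or cuts).
        problem = refine(problem, relax_point)
    return lower, upper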

17:20

Leo Liberti

joint work with Pierre-Louis Poirion, Ky Vu, Claudia D'Ambrosio

Random projections for quadratic optimization

Random projections are used as dimensionality-reduction techniques in many settings. They project a set of points in a high-dimensional space to a lower-dimensional one while approximately preserving all pairwise Euclidean distances. Usually, random projections are applied to numerical data. In this talk we present a successful application of random projections to quadratic programming problems. We derive approximate feasibility and optimality results for the lower-dimensional problem, and showcase some computational experiments illustrating the usefulness of our techniques.
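
The distance-preservation property underlying this line of work can be illustrated with a short Python snippet (a generic Gaussian projection, not the authors' code), which projects random points from $R^d$ to $R^k$ and compares pairwise distances before and after:

import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, d, k = 100, 10_000, 500           # number of points, original and projected dimensions
X = rng.standard_normal((n, d))      # n points in R^d

# Gaussian random projection with entries scaled by 1/sqrt(k), so that
# squared Euclidean norms are preserved in expectation.
P = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ P                            # the same n points, now in R^k

# All pairwise distances should be roughly preserved.
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
print(f"distance ratios after projection: min {min(ratios):.3f}, max {max(ratios):.3f}")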