Wed.1 11:30–12:45 | H 1029 | NON

Practical Aspects of Nonlinear Optimization

Chair: Tim Mitchell | Organizer: Coralia Cartis
11:30

Hiroshige Dan

joint work with Koei Sakiyama

Automatic Differentiation Software for Large-scale Nonlinear Optimization Problems

Automatic differentiation (AD) is essential for solving large-scale nonlinear optimization problems (NLPs). In this research, we implement AD software designed specifically for large-scale NLPs. Such problems are usually expressed with indexed variables and functions, and our software handles these indexed elements efficiently. Numerical experiments demonstrate the advantage of our software on large-scale NLPs. Moreover, it can compute partial derivatives with arbitrary-precision arithmetic.
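As background for readers unfamiliar with AD, the following minimal Python sketch illustrates forward-mode AD via dual numbers; it is a generic illustration only, not the authors' software, and the names Dual, partial, and the example objective f are hypothetical.

```python
# Minimal forward-mode AD sketch using dual numbers (illustrative only;
# not the authors' software). Each Dual carries a value and a derivative.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def partial(f, x, i):
    """Partial derivative of f at x with respect to x[i] (seed x[i] with 1)."""
    duals = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
    return f(duals).dot


# Example: an "indexed" objective f(x) = sum_i x[i] * x[i+1].
f = lambda x: sum(x[i] * x[i + 1] for i in range(len(x) - 1))
print(partial(f, [1.0, 2.0, 3.0], 1))  # df/dx_1 = x_0 + x_2 = 4.0
```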

11:55

Michael Feldmeier

Structure Detection in Generic Quadratic Programming

Benders' decomposition is a popular choice for solving optimization problems with complicating variables. Implementations are often problem dependent, or at least require the user to specify the master and sub-problems. In the past, attempts have been made to automatically detect suitable structure in the constraint matrix of generic (mixed-integer) linear programmes. In this talk we discuss an extension of this idea to quadratic programming and cover different approaches to finding suitable structure in the constraint and Hessian matrices. Initial numerical results will be given.
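For orientation, the block-angular structure that Benders' decomposition exploits can be written in a generic form; the LaTeX sketch below is illustrative only (the symbols y, x_k, B_k, A_k are assumptions, not the speaker's notation).

```latex
% Generic block-angular form exploited by Benders' decomposition
% (illustrative; not the speaker's exact formulation).
\begin{align*}
\min_{y,\,x_1,\dots,x_K} \quad & f(y) + \sum_{k=1}^{K} d_k^\top x_k \\
\text{s.t.} \quad & B_k y + A_k x_k \ge b_k, \qquad k = 1,\dots,K.
\end{align*}
% Fixing the complicating variables y decouples the problem into K
% independent subproblems in the x_k; the constraint matrix then has
% the bordered block-diagonal pattern
\[
\begin{pmatrix}
B_1    & A_1 &     &        &     \\
B_2    &     & A_2 &        &     \\
\vdots &     &     & \ddots &     \\
B_K    &     &     &        & A_K
\end{pmatrix}.
\]
```

Automatic structure detection amounts to finding a row and column permutation of the constraint (and, for QPs, Hessian) matrix that exposes such a pattern.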

12:20

Tim Mitchell

joint work with Frank E. Curtis, Michael Overton

Relative Minimization Profiles: A Different Perspective for Benchmarking Optimization and Other Numerical Software

We present relative minimization profiles (RMPs) for benchmarking general optimization methods and other numerical algorithms. Many existing visualization tools, e.g., performance profiles, treat success as binary: a problem is either “solved” or not. RMPs instead consider degrees of success, from obtaining solutions to machine precision to merely finding feasible ones. RMPs concisely show how methods compare in terms of the amount of minimization or accuracy attained, the ability to find feasible solutions, and speed of progress, i.e., how overall solution quality changes across different computational budgets.
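To convey the "degrees of success" idea in code, the sketch below sweeps a tolerance and reports, for each method, the fraction of problems whose final objective lies within that relative distance of the best value found by any method. This is a simplified illustration under assumed inputs, not the exact RMP definition from the talk; the names success_fractions, objectives, and taus are hypothetical.

```python
import numpy as np

# Simplified degrees-of-success curve (illustrative only; not the exact
# RMP definition). For each tolerance tau, compute the fraction of
# problems on which a method's final objective is within relative
# distance tau of the best objective found by any method.
def success_fractions(objectives, taus):
    """objectives: dict mapping method name -> array of final objective
    values (one per problem, lower is better); taus: tolerances to sweep."""
    methods = list(objectives)
    vals = np.array([objectives[m] for m in methods])     # methods x problems
    best = vals.min(axis=0)                                # best per problem
    rel = (vals - best) / np.maximum(np.abs(best), 1e-16)  # relative gap
    return {m: [(rel[i] <= t).mean() for t in taus]
            for i, m in enumerate(methods)}

taus = np.logspace(-12, 0, 50)   # from near machine precision to a 100% gap
data = {"method A": np.array([1.0, 2.0, 3.5]),
        "method B": np.array([1.0 + 1e-8, 2.2, 3.0])}
curves = success_fractions(data, taus)  # one curve per method, ready to plot
```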