Tue.3 14:45–16:00 | H 3012 | ROB

Distributionally Robust Optimization

Chair: Soroosh Shafieezadeh Abadeh
Organizer: Soroosh Shafieezadeh Abadeh
14:45

Tobias Sutter

joint work with Peyman Mohajerin Esfahani, Daniel Kuhn, John Lygeros

Data-driven Approximate Dynamic Programming

We consider linear programming (LP) problems in infinite-dimensional spaces and develop an approximation bridge from the infinite-dimensional LP to tractable finite convex programs in which the approximation quality is quantified explicitly. We illustrate our theoretical results on the optimal control of Markov decision processes on Borel spaces whose dynamics are unknown and must be learned from data, and we derive an explicit probabilistic error bound between the data-driven finite convex program and the original infinite-dimensional LP.
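
For orientation only, a minimal sketch of the classical LP characterization of a discounted-cost MDP from which such approximation schemes typically start (notation assumed here, not taken from the talk): with Borel state space X, action space A, stage cost c, discount factor \alpha \in (0,1), transition kernel Q, and a reference measure \nu,

\max_{v} \; \int_X v \,\mathrm{d}\nu \quad \text{s.t.} \quad v(x) \le c(x,a) + \alpha \int_X v(y)\, Q(\mathrm{d}y \mid x,a) \quad \forall (x,a) \in X \times A.

The data-driven finite convex program can then be thought of as replacing Q by an estimate learned from samples and restricting v to a finite-dimensional function class.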

15:10

Nian Si

joint work with Karthyek Murthy, Jose Blanchet

Asymptotic Normality and Optimal Confidence Regions in Wasserstein Distributionally Robust Optimization

We provide a theory of confidence regions for Wasserstein distributionally robust optimization (DRO) formulations. In addition to establishing the asymptotic normality of these estimators (under natural convex constraints), we characterize optimal confidence regions (in a sense informed by the underlying DRO formulation) and study the asymptotic shape of these regions in a suitable topology for the convergence of sets.
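
As a schematic reference (notation assumed, not taken from the talk), a Wasserstein DRO estimator of the type analyzed here solves

\min_{\theta \in \Theta} \; \sup_{P \,:\, W(P, \hat{P}_n) \le \delta_n} \mathbb{E}_P[\ell(\theta, \xi)],

where \hat{P}_n is the empirical distribution of n samples, W a Wasserstein distance, and \delta_n the radius of the ambiguity set; the confidence-region theory concerns how the resulting set of (constrained) minimizers behaves as n grows and \delta_n is scaled appropriately.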

15:35

Soroosh Shafieezadeh Abadeh

joint work with Daniel Kuhn, Viet Anh Nguyen, Peyman Mohajerin Esfahani

Bridging Bayesian and Minimax Mean Square Error Estimation via Wasserstein Distributionally Robust Optimization

We bridge the Bayesian and the minimax mean square error estimators by leveraging tools from modern distributionally robust optimization. We show that the optimal estimator and the least favorable distribution form a Nash equilibrium for the Wasserstein ambiguity set and can be computed by solving a single tractable convex program. We further devise a Frank-Wolfe algorithm with a linear convergence rate for solving this convex program, whose direction-finding subproblem can be solved in quasi-closed form. Using these ingredients, we introduce an image denoiser and a robust Kalman filter.
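
As a schematic statement (notation assumed, not taken from the talk), the minimax mean square error problem over a Wasserstein ambiguity set can be written as

\min_{\psi} \; \sup_{\mathbb{P} \in \mathbb{B}_\rho(\hat{\mathbb{P}})} \mathbb{E}_{\mathbb{P}}\big[ \lVert x - \psi(y) \rVert^2 \big],

where \psi ranges over estimators of the signal x given the observation y, \hat{\mathbb{P}} is a nominal joint distribution of (x, y), and \mathbb{B}_\rho(\hat{\mathbb{P}}) is a Wasserstein ball of radius \rho; the least favorable distribution is the inner maximizer paired with the optimal \psi in the Nash equilibrium mentioned above.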