joint work with Irene Fonseca, Pan Liu
In this talk we will introduce a novel class of regularizers, providing a unified approach to standard regularizers such as TV and TGV^2. By means of a bilevel training scheme we will simultaneously identify optimal parameters and regularizers for any given class of training imaging data. Existence of solutions to the training scheme will be shown via Gamma-convergence. Finally, we will provide some explicit examples and numerical results. This is joint work with Irene Fonseca (Carnegie Mellon University) and Pan Liu (University of Cambridge).
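The bilevel training scheme mentioned above can be illustrated with a minimal sketch. The signal, noise level, and parameter grid below are hypothetical, and a quadratic (H1-type) smoothness penalty stands in for TV/TGV so that the lower-level problem has a closed-form solution; the upper level then selects the regularization weight whose reconstruction best matches the ground truth:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pair: a piecewise-constant 1D signal and a noisy observation.
n = 200
x = np.linspace(0, 1, n)
u_true = (x > 0.3).astype(float) + 0.5 * (x > 0.7)
f = u_true + 0.05 * rng.standard_normal(n)

# Forward-difference operator D (stand-in for the gradient in TV-type penalties).
D = np.diff(np.eye(n), axis=0)

def lower_level(alpha):
    """Solve min_u 0.5||u - f||^2 + 0.5*alpha*||D u||^2 via its normal equations."""
    return np.linalg.solve(np.eye(n) + alpha * D.T @ D, f)

# Upper level: pick the weight whose reconstruction is closest to the ground truth.
alphas = np.logspace(-3, 2, 30)
errors = [np.linalg.norm(lower_level(a) - u_true) for a in alphas]
best_alpha = alphas[int(np.argmin(errors))]
print(best_alpha)
```

In the actual scheme the upper level also ranges over a class of regularizers (e.g. TV vs. TGV^2), not just over scalar weights, and existence of minimizers is established by Gamma-convergence rather than a finite grid search.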
joint work with Karl Kunisch
We address a learning approach for choosing regularization operators from given families of operators for the regularization of ill-posed inverse problems. Assuming that some ground truth data is given, the basic idea is to learn regularization operators by solving a bilevel optimization problem, in which the lower-level problem, which depends on the regularization operators, is a Tikhonov-type regularized problem. As concrete examples we consider families of operators arising in multi-penalty Tikhonov regularization and a family of operators inspired by nonlocal energy semi-norms.
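A toy version of this bilevel setup, for the multi-penalty Tikhonov case, can be sketched as follows. The two penalty operators (identity and forward differences), the test signal, and the grid search are illustrative choices, not the families studied in the talk:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Hypothetical ground truth and noisy data (denoising, i.e. forward operator = identity).
n = 100
u_true = np.sin(2 * np.pi * np.linspace(0, 1, n))
f = u_true + 0.1 * rng.standard_normal(n)

# A family of two penalty operators: the identity and a forward-difference operator.
L1 = np.eye(n)
L2 = np.diff(np.eye(n), axis=0)

def tikhonov(a1, a2):
    """Lower level: min_u ||u - f||^2 + a1 ||L1 u||^2 + a2 ||L2 u||^2 (closed form)."""
    return np.linalg.solve(np.eye(n) + a1 * L1.T @ L1 + a2 * L2.T @ L2, f)

# Upper level: choose penalty weights minimizing the error to the ground truth.
grid = np.logspace(-4, 1, 12)
best = min(product(grid, grid),
           key=lambda p: np.linalg.norm(tikhonov(*p) - u_true))
print(best)
```

In practice the upper-level problem is solved with optimization methods rather than an exhaustive grid, and the learned objects are operators (e.g. nonlocal semi-norms), not merely scalar weights.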
joint work with Petri Kuusela, Aku Seppänen, Tuomo Valkonen
Inverse problems governed by partial differential equations (PDEs), such as image reconstruction in electrical impedance tomography (EIT), often lead to non-convex optimization problems. These problems are also non-smooth when total variation (TV) type regularization is employed. To improve upon current optimization methods, such as (semi-smooth) Newton's method and non-linear primal-dual proximal splitting (NL-PDPS), we propose a new alternative: an inexact relaxed Gauss-Newton method. We investigate its convergence both theoretically and numerically through experimental EIT studies.
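The basic relaxed Gauss-Newton iteration can be sketched on a toy smooth nonlinear least-squares problem. The model, data, relaxation parameter, and iteration count below are illustrative assumptions; the method of the talk additionally handles the non-smooth TV term and PDE-based forward maps such as the EIT operator:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: fit y = a * exp(-b * x) to noisy samples (hypothetical data).
x = np.linspace(0, 2, 50)
a_true, b_true = 2.0, 1.5
y = a_true * np.exp(-b_true * x) + 0.01 * rng.standard_normal(x.size)

def residual(p):
    a, b = p
    return a * np.exp(-b * x) - y

def jacobian(p):
    a, b = p
    e = np.exp(-b * x)
    return np.column_stack([e, -a * x * e])

p = np.array([1.0, 1.0])   # initial guess
tau = 0.8                  # relaxation (damping) parameter, an assumed choice

for _ in range(30):
    J, r = jacobian(p), residual(p)
    # Gauss-Newton direction from the linearized least-squares subproblem;
    # in the inexact variant this solve would only be approximate.
    step, *_ = np.linalg.lstsq(J, -r, rcond=None)
    p = p + tau * step      # relaxed (damped) update

print(p)
```

The relaxation factor tau < 1 is what distinguishes this damped update from a full Gauss-Newton step; in the setting of the talk it helps cope with the non-convexity of the PDE-constrained objective.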