Wed.2 14:15–15:30 | H 2013 | NON

New Trends in Optimization Methods and Machine Learning (2/3)

Chair: El houcine Bergou
Organizers: Aritra Dutta, El houcine Bergou
14:15

Aritra Dutta

joint work with Xin Li, Boqing Gong, Mubarak Shah

On a Weighted Singular Value Thresholding Algorithm

Sparse and low-rank approximation problems have many crucial applications in computer vision and recommendation systems. In this paper, we formulate and study a weighted singular value thresholding (WSVT) problem. Unlike many methods in the literature, such as generalized SVT and weighted nuclear norm minimization, we do not modify the nuclear norm in SVT. Instead, we use a weighted Frobenius norm combined with the nuclear norm. We present an algorithm to numerically solve our WSVT problem and establish the convergence of the algorithm. As a proof of concept, we extensively apply WSVT to a variety of applications, including removal of shadows, inlier detection, and background estimation, and demonstrate that the weighted Frobenius norm in our WSVT can work as efficiently as the $\ell_1$ norm used in Robust PCA algorithms. Moreover, we match or outperform state-of-the-art principal component pursuit (PCP) algorithms in almost all quantitative and qualitative measures and in a fraction of their execution time. This indicates that with a properly inferred weight matrix, our computationally inexpensive WSVT algorithm can serve as a natural alternative to the PCP problem.
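
The abstract does not spell out the objective; one formulation consistent with it, treating the elementwise weighting and the threshold $\tau$ as assumptions, is $\min_X \tfrac{1}{2}\|W \odot (A - X)\|_F^2 + \tau \|X\|_*$. The sketch below implements only the classical (unweighted) singular value thresholding operator, i.e. the proximal map of $\tau\|X\|_*$ that weighted variants such as WSVT build on; it is not the authors' WSVT algorithm.

import numpy as np

def svt(A, tau):
    # Soft-threshold the singular values of A by tau (prox of tau * nuclear norm).
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

# Illustrative use on a hypothetical noisy low-rank matrix.
rng = np.random.default_rng(0)
L = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))   # rank-3 signal
A = L + 0.1 * rng.standard_normal((60, 40))                       # noisy observation
X_hat = svt(A, tau=1.0)                                           # low-rank estimate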

14:40

Michel Barlaud

joint work with Antonin Chambolle, Jean-Baptiste Caillau

Robust supervised classification and feature selection using a primal-dual method

This paper deals with supervised classification and feature selection in high dimensional space. A classical approach is to project the data onto a low-dimensional space and classify by minimizing an appropriate quadratic cost. It is well known that using a quadratic cost is not robust to outliers. We cope with this problem by using an $\ell_1$ norm both for the constraint and for the loss function. In this paper, we provide a novel tailored constrained primal-dual method with convergence proofs to jointly compute selected features and classifiers.
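
The paper's tailored constrained primal-dual method is not reproduced here; as a generic point of reference, the following is a minimal sketch of a standard Chambolle-Pock primal-dual iteration for an $\ell_1$ loss with an $\ell_1$ penalty, $\min_w \|Xw - y\|_1 + \lambda\|w\|_1$ (the penalised rather than constrained form, and all names and parameters, are assumptions).

import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def primal_dual_l1(X, y, lam, n_iter=500):
    # Chambolle-Pock iterations for min_w ||X w - y||_1 + lam * ||w||_1.
    m, d = X.shape
    L = np.linalg.norm(X, 2)              # operator norm of X
    sigma = tau = 0.95 / L                # step sizes with sigma * tau * L**2 < 1
    w = np.zeros(d); w_bar = np.zeros(d); p = np.zeros(m)
    for _ in range(n_iter):
        # dual step: projection onto the unit infinity-norm ball
        p = np.clip(p + sigma * (X @ w_bar - y), -1.0, 1.0)
        # primal step: soft-thresholding (prox of the l1 penalty)
        w_new = soft_threshold(w - tau * (X.T @ p), tau * lam)
        w_bar = 2.0 * w_new - w           # over-relaxation
        w = w_new
    return w

This generic scheme handles the convex problem above; the paper's contribution lies in the tailored constrained variant and its convergence proofs.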

15:05

Jakub Mareček

Time-varying Non-convex Optimisation: Three Case Studies

We consider three prominent examples of non-convex optimisation: power systems (e.g., alternating current optimal power flow), low-rank modelling (e.g., background subtraction in computer vision), and learning linear dynamical systems (e.g., time-series forecasting). There has been considerable recent progress in tackling such problems, via hierarchies of convexifications or otherwise, at the price of non-trivial run-time. We study the three examples in a setting where a sequence of time-varying instances is available. We showcase tracking results and regret bounds.
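
A minimal sketch of the time-varying setting, on a toy convex quadratic surrogate rather than the talk's non-convex case studies (the drifting target, step counts, and regret accounting are assumptions): a few warm-started gradient steps per instance, with dynamic regret accumulated against each instance's optimum.

import numpy as np

rng = np.random.default_rng(1)
d, T, inner_steps = 10, 50, 5
A = rng.standard_normal((d, d))
eta = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from the Lipschitz constant
x = np.zeros(d)
dynamic_regret = 0.0
for t in range(T):
    b_t = np.sin(0.1 * t) * np.ones(d) + 0.05 * rng.standard_normal(d)  # drifting instance f_t
    for _ in range(inner_steps):           # warm-started tracking of the current optimum
        x -= eta * (A.T @ (A @ x - b_t))
    x_star = np.linalg.lstsq(A, b_t, rcond=None)[0]
    dynamic_regret += 0.5 * (np.linalg.norm(A @ x - b_t) ** 2 - np.linalg.norm(A @ x_star - b_t) ** 2)
print(f"dynamic regret over {T} instances: {dynamic_regret:.3f}")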