joint work with Mihai I. Florea
Numerous problems in signal processing are cast or approximated as large-scale convex optimization problems. Because of their size, such problems can be solved only with first-order methods, and acceleration then becomes the key feature of these methods. We will overview some of our recent results on the topic (the majorization-minimization framework appears to be an umbrella for many such methods, including Nesterov's FGM, FISTA, our ACGM, and others) and present a number of successful applications in signal processing (LASSO, NNLS, L1LR, RR, EN, and radar waveform design with required properties).
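As a concrete illustration of the accelerated proximal-gradient template touched on above, here is a minimal sketch of FISTA applied to a LASSO instance. The data `A`, `b`, the weight `lam`, and the iteration count are placeholders for illustration only, not tied to any experiment from the talk.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista_lasso(A, b, lam, num_iters=500):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 with FISTA (constant step 1/L)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)              # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Usage on synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 200))
x_true = np.zeros(200); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = fista_lasso(A, b, lam=0.1)
```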
joint work with Marius Pesavento
Block coordinate descent based on the nonlinear best response, also known as the Gauss-Seidel algorithm, is one of the most popular approaches in large-scale nonsmooth optimization. In this paper, we propose a parallel (Jacobi) best-response algorithm in which a stepsize is introduced to guarantee convergence. Although the original problem is nonsmooth, we propose a simple line search carried out over a properly designed smooth function. The proposed algorithm, evaluated on the nonsmooth nonconvex sparsity-regularized rank minimization problem, exhibits fast convergence and low complexity.
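To make the parallel best-response idea concrete, here is a minimal sketch on a convex LASSO instance (the paper itself targets a nonconvex rank-minimization problem). All coordinates are updated simultaneously from their scalar best responses, and the stepsize is found by an exact line search over the smooth quadratic plus a linearized l1 term, which is one plausible instantiation of the "line search over a properly designed smooth function" described above; the specific rule and defaults below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def soft_threshold(z, tau):
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def jacobi_best_response_lasso(A, b, lam, num_iters=200):
    """Parallel (Jacobi) best-response sketch for min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    m, n = A.shape
    d = np.sum(A * A, axis=0)                 # per-coordinate curvatures ||a_i||^2
    x = np.zeros(n)
    r = A @ x - b                             # residual
    for _ in range(num_iters):
        # scalar best responses, computed for all coordinates in parallel
        c = x - (A.T @ r) / d
        bx = soft_threshold(c, lam / d)
        dx = bx - x
        Adx = A @ dx
        # exact line search on gamma in [0, 1] for the quadratic
        # 0.5*||r + gamma*Adx||^2 + lam*(||x||_1 + gamma*(||bx||_1 - ||x||_1))
        denom = Adx @ Adx
        if denom <= 1e-16:
            break
        num = -(Adx @ r + lam * (np.abs(bx).sum() - np.abs(x).sum()))
        gamma = min(max(num / denom, 0.0), 1.0)
        x = x + gamma * dx
        r = r + gamma * Adx
    return x
```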
joint work with Shiqian Ma, Anthony Man-Cho So, Shuzhong Zhang
We propose a vector transport-free stochastic variance reduced gradient (TF-R-SVRG) method for empirical risk minimization over a Riemannian manifold. The proposed TF-R-SVRG handles general retraction operations and does not need the additional vector transport evaluations required by most existing manifold SVRG methods. We analyze the iteration complexity of TF-R-SVRG for obtaining an $\epsilon$-stationary point and establish its local linear convergence under the Łojasiewicz inequality. We also incorporate the Barzilai-Borwein step size and design a very practical TF-R-SVRG-BB method.
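A minimal sketch of the transport-free idea on the unit sphere, for the leading-eigenvector problem $\min_{\|x\|=1} -\frac{1}{2N}\sum_i (a_i^T x)^2$. Here the variance-reduced direction is formed from Euclidean stochastic gradient differences projected onto the tangent space at the current iterate, so no vector transport is used; this construction, the constant stepsize (in place of the Barzilai-Borwein rule), and all names below are assumptions made for illustration rather than a verbatim transcription of the method in the talk.

```python
import numpy as np

def proj_tangent(x, g):
    """Project a Euclidean vector onto the tangent space of the unit sphere at x."""
    return g - (x @ g) * x

def retract(x, v):
    """Retraction on the sphere: move along v and renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def tf_rsvrg_sphere(A, step=0.1, epochs=20, inner=None, seed=0):
    """Sketch of a transport-free Riemannian SVRG on the unit sphere."""
    rng = np.random.default_rng(seed)
    N, n = A.shape
    inner = inner or N
    x = rng.standard_normal(n); x /= np.linalg.norm(x)
    for _ in range(epochs):
        x_snap = x.copy()
        full_grad = -(A.T @ (A @ x_snap)) / N         # full Euclidean gradient at snapshot
        for _ in range(inner):
            i = rng.integers(N)
            a = A[i]
            gi_x = -(a @ x) * a                        # stochastic Euclidean gradient at x
            gi_snap = -(a @ x_snap) * a                # same sample at the snapshot
            v = proj_tangent(x, gi_x - gi_snap + full_grad)
            x = retract(x, -step * v)
    return x
```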