Joint work with Silvia Bonettini and Marco Prato.
We present an inexact variable metric linesearch-based forward-backward method for minimizing the sum of an analytic function and a subanalytic convex term. Building on the concept of the forward-backward envelope, we construct a continuously differentiable surrogate function that coincides with the objective function at every stationary point and satisfies a relative error condition which may fail for the objective function itself. By exploiting the Kurdyka-Łojasiewicz property of the surrogate, we prove convergence of the iterates to a stationary point, as well as convergence rates for the function values.
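To fix ideas, here is a minimal sketch, not the authors' implementation, of one forward-backward step with an Armijo-type backtracking linesearch along the direction toward the proximal point. For readability it uses the Euclidean metric, whereas the variable metric version would scale both the gradient step and the proximal operator; all identifiers (f, grad_f, g, prox_g, lam, sigma, rho) are illustrative assumptions.

```python
import numpy as np

def fb_linesearch_step(x, f, grad_f, g, prox_g, lam=1.0,
                       sigma=1e-4, rho=0.5, max_backtracks=50):
    """One forward-backward step with backtracking along d = y - x,
    where y is the forward-backward (proximal gradient) point."""
    grad = grad_f(x)
    y = prox_g(x - lam * grad, lam)      # forward (gradient) + backward (prox)
    d = y - x
    # composite slope; <= 0 by optimality of y in the prox subproblem
    delta = grad @ d + g(y) - g(x)
    F_x = f(x) + g(x)
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) + g(x + alpha * d) <= F_x + sigma * alpha * delta:
            break
        alpha *= rho                     # backtrack until sufficient decrease
    return x + alpha * d
```

As a sanity check, one could run this with f(x) = ½‖Ax − b‖² and g(x) = ‖x‖₁, whose proximal operator is componentwise soft thresholding.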
A block decomposition method is proposed for minimizing a non-convex continuously differentiable function subject to one linear equality constraint and simple bounds on the variables. The working set is computed according to an almost cyclic strategy that does not use first-order information, so that the full gradient of the objective function never needs to be computed during the algorithm. Under an appropriate assumption on the level set, global convergence to stationary points is established for several line search strategies. Finally, numerical results are provided.
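To make the working set mechanics concrete, here is a hedged sketch of one two-variable step for min f(x) s.t. a @ x = b, lo <= x <= hi, with the pair (i, j) supplied externally by an almost cyclic rule. Consistent with the abstract, it evaluates only the two partial derivatives involved, never the full gradient. The direction components 1/a[i] and -1/a[j] assume nonzero constraint coefficients, and all names are illustrative, not the paper's notation.

```python
import numpy as np

def two_block_step(x, f, partial, a, lo, hi, i, j, sigma=1e-4, rho=0.5):
    """One two-coordinate step; partial(x, k) returns df/dx_k."""
    # direction leaving a @ x unchanged: a[i]*di + a[j]*dj = 0
    di, dj = 1.0 / a[i], -1.0 / a[j]
    slope = partial(x, i) * di + partial(x, j) * dj
    if slope == 0.0:
        return x
    if slope > 0.0:          # flip so (di, dj) is a descent direction
        di, dj, slope = -di, -dj, -slope
    # largest feasible step before a bound becomes active
    def cap(k, dk):
        return (hi[k] - x[k]) / dk if dk > 0 else (lo[k] - x[k]) / dk
    t = min(cap(i, di), cap(j, dj))
    fx = f(x)
    while t > 1e-12:         # Armijo backtracking along the feasible segment
        y = x.copy()
        y[i] += t * di
        y[j] += t * dj
        if f(y) <= fx + sigma * t * slope:
            return y
        t *= rho
    return x
```

An almost cyclic rule could, for instance, sweep the index pairs in a fixed order while occasionally revisiting selected pairs, as long as every variable is touched at least once within a bounded number of iterations.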
We mainly consider the problem of solving homogeneous linear inequalities. We formulate a non-convex optimization problem in a slightly lifted space whose critical points map to feasible solutions of the linear inequalities. We establish several properties of the non-convex objective function and develop an algorithm that computes its critical points, thereby yielding an algorithm for solving the linear inequalities. We prove convergence guarantees for the algorithm and investigate its performance in computational experiments.
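The abstract does not spell out the lifted formulation, so the sketch below substitutes a plainly named generic non-convex approach to strict homogeneous inequalities A @ x > 0: projected gradient descent on the squared-hinge violation F(x) = ½·Σ min(A@x, 0)² over the unit sphere, where the sphere constraint rules out the trivial solution x = 0 and is the source of the non-convexity. This illustrates the overall recipe, computing critical points of a non-convex surrogate and mapping them to feasible points, and is not the paper's actual method.

```python
import numpy as np

def find_feasible(A, iters=5000, lr=0.1, seed=0):
    """Seek x with A @ x > 0 by projected gradient descent on the
    squared-hinge violation over the unit sphere (illustrative only)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        r = np.minimum(A @ x, 0.0)   # violated residuals (negative parts)
        grad = A.T @ r               # gradient of F(x) = 0.5 * ||r||^2
        x -= lr * grad
        x /= np.linalg.norm(x)       # project back onto the unit sphere
        if np.all(A @ x > 0):
            return x                 # strictly feasible point found
    return None                      # no feasible point found in budget
```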