We suggest simple, implementable modifications of gradient-type methods for smooth convex optimization problems in Hilbert spaces, which may be ill-posed. At each iteration, the selected method is applied to a perturbed optimization problem, but the perturbation is changed only after a simple estimate inequality is satisfied, which allows us to use mild rules for choosing the parameters. Under these rules we prove strong convergence and establish complexity estimates for the methods. Preliminary computational tests confirm the efficiency of the proposed modifications.
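The abstract does not spell out the scheme; a minimal sketch of one natural reading follows, with gradient steps taken on a Tikhonov-perturbed objective f(x) + (eps/2)||x||^2 and eps decreased only once the perturbed gradient satisfies a simple estimate inequality. The test ||grad f_eps(x)|| <= c*eps and all parameter choices below are illustrative assumptions, not the authors' rules.

import numpy as np

def perturbed_gradient_method(grad_f, x0, L, eps0=1.0, c=0.5,
                              theta=0.5, tol=1e-6, max_iter=100000):
    """Gradient steps on the Tikhonov-perturbed objective
    f_eps(x) = f(x) + (eps/2)*||x||^2 (illustrative reading of the
    abstract; the authors' estimate inequality may differ).

    grad_f : gradient of the original (possibly ill-posed) objective
    L      : Lipschitz constant of grad_f
    eps0   : initial perturbation parameter
    c      : constant in the assumed estimate test ||g|| <= c*eps
    theta  : factor by which eps is reduced once the test holds
    """
    x, eps = np.asarray(x0, dtype=float), eps0
    for _ in range(max_iter):
        g = grad_f(x) + eps * x           # gradient of the perturbed problem
        x = x - g / (L + eps)             # standard 1/L-type gradient step
        if np.linalg.norm(g) <= c * eps:  # simple estimate inequality (assumed form)
            eps *= theta                  # change the perturbation only now
            if eps < tol:
                break
    return x

A usage example on a rank-deficient quadratic, where plain gradient descent need not converge to the minimum-norm solution but the regularized iterates do:

A = np.diag([1.0, 0.0])  # singular Hessian -> ill-posed problem
x_star = perturbed_gradient_method(lambda x: A @ x, x0=[1.0, 1.0], L=1.0)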
A general framework for a fractional-dimensional optimization method is introduced. It is shown that the function admits a tangent line in fractional dimensions which passes directly through the root. The method is presented to investigate the possibility of solving optimization problems directly; however, the analysis of the fractional dimension needs further investigation. Moreover, the method can be used for convex and non-convex problems, as well as for local and global optimization problems. In this sense, the fractional-dimensional optimization method opens a different view of, and direction for, optimization problems.
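The abstract gives no formulas; one plausible formalization, borrowed from the fractional-calculus literature rather than from this talk, replaces the classical tangent by the first-order term of a generalized Taylor expansion with the Caputo derivative $D^{\alpha}$ of order $\alpha \in (0,1]$:
\[
y = f(x_k) + \frac{(x-x_k)^{\alpha}}{\Gamma(\alpha+1)}\, D^{\alpha} f(x_k),
\qquad
x_{k+1} = x_k + \left( -\frac{\Gamma(\alpha+1)\, f(x_k)}{D^{\alpha} f(x_k)} \right)^{1/\alpha}.
\]
Setting $y = 0$ in the fractional tangent yields the update on the right; for $\alpha = 1$ it reduces to the classical Newton step, and the claim that the tangent "passes directly through the root" would correspond to choosing $\alpha$ so that this update reaches the root in one step. Whether this matches the authors' construction is an assumption.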
Joint work with David Müller and Yurii Nesterov.
We derive new prox-functions on the simplex from additive random utility models of discrete choice. They are convex conjugates of the corresponding surplus functions. We explicitly derive the convexity parameter of discrete choice prox-functions associated with generalized extreme value models, and specifically with generalized nested logit models. Incorporated into subgradient schemes, discrete choice prox-functions lead to natural probabilistic interpretations of the iteration steps. As an illustration, we discuss an economic application of discrete choice prox-functions to consumption cycles.
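For the simplest additive random utility model, the multinomial logit, the surplus function is E(u) = mu*log(sum_i exp(u_i/mu)) and its convex conjugate on the simplex is the scaled negative entropy; the induced subgradient step is then the familiar softmax (exponentiated-gradient) update, whose iterates are exactly logit choice probabilities. A minimal sketch of this special case follows; the simplex objective and step size are illustrative, not taken from the paper.

import numpy as np

def logit_mirror_descent(subgrad_f, p0, mu=1.0, step=0.1, iters=500):
    """Subgradient scheme on the simplex with the entropic prox-function,
    i.e. the convex conjugate of the multinomial logit surplus
    E(u) = mu * log(sum_i exp(u_i / mu)).  Each iterate is a probability
    vector, so the steps read as updates of logit choice probabilities."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        g = subgrad_f(p)
        # softmax update = exact prox step for the entropy prox-function
        w = p * np.exp(-step * g / mu)
        p = w / w.sum()
    return p

# usage: minimize a quadratic over the simplex (illustrative objective)
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
p_star = logit_mirror_descent(lambda p: Q @ p, p0=np.array([0.5, 0.5]))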