11:30 AM - 11:55 AM
A scaling proximal point algorithm in domains of positivity
This paper presents a proximal point method for convex minimization problems in a special class of Hadamard manifolds called homogeneous domains of positivity. The method is related, in a certain sense, to a particularization of the de Rham decomposition theorem for symmetric spaces, and it computes the minimum of Riemannian convex functions. The homogeneous domains of positivity considered in this work are those of reducible type. Computational experiments with Riemannian averages of symmetric positive definite matrices, such as those arising in DTI problems, show that the method is potentially useful in practical applications.
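The abstract does not specify the scaling proximal point iteration itself, but the Riemannian average it is tested on is the classical Karcher mean of symmetric positive definite matrices. A minimal sketch of the standard fixed-point iteration for that mean (this is not the paper's method; function names, the starting point, and tolerances are illustrative):

```python
import numpy as np

def _sym_fun(M, f):
    """Apply a scalar function to a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(f(w)) @ V.T

def karcher_mean(mats, iters=100, tol=1e-12):
    """Fixed-point iteration for the Riemannian (Karcher) mean of SPD matrices."""
    X = sum(mats) / len(mats)                          # arithmetic mean as a starting point
    for _ in range(iters):
        Xh  = _sym_fun(X, np.sqrt)                     # X^{1/2}
        Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))  # X^{-1/2}
        # average the data points in the tangent space at X
        T = sum(_sym_fun(Xih @ A @ Xih, np.log) for A in mats) / len(mats)
        X_new = Xh @ _sym_fun(T, np.exp) @ Xh
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X
```

For commuting SPD matrices this reduces to the entrywise geometric mean of the eigenvalues, which gives a quick sanity check.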
11:55 AM - 12:20 PM
Extragradient method in optimization: Convergence and complexity
We consider the extragradient method to minimize the sum of two functions, the first being smooth and the second convex. Under the Kurdyka–Łojasiewicz assumption, we prove that the sequence produced by the extragradient method converges to a critical point of the problem and has finite length. The analysis is extended to the case where both functions are convex. We provide a 1/k convergence rate, which is classical for gradient methods. Furthermore, we show that the recent small-prox complexity result can be applied to this method. Considering the extragradient method is also the occasion to describe an exact line search for proximal decomposition methods. We provide details on the implementation of this scheme for the ℓ1-regularized least squares problem and give numerical results suggesting that combining nonaccelerated methods with exact line search can be a competitive choice.
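For the smooth-plus-convex setting above, one common extragradient variant performs two forward-backward steps per iteration: a prediction step at the current point and a correction step using the gradient at the predicted point. A minimal sketch for the ℓ1-regularized least squares example mentioned in the abstract, with a fixed step size rather than the exact line search the talk describes (names and constants are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def extragradient_lasso(A, b, lam, iters=500):
    """Extragradient method for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    s = 0.9 / L                            # fixed step size, strictly below 1/L
    grad = lambda z: A.T @ (A @ z - b)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        y = soft_threshold(x - s * grad(x), s * lam)   # prediction step
        x = soft_threshold(x - s * grad(y), s * lam)   # correction step
    return x
```

With A equal to the identity the solution is the soft-thresholded data vector, which makes the scheme easy to verify on a toy instance.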
12:20 PM - 12:45 PM
An augmented Lagrangian method for equality constrained optimization with rapid infeasibility detection capabilities
We present a primal-dual augmented Lagrangian method for solving an equality constrained minimization problem that is able to rapidly detect infeasibility. A parameter is introduced to scale the objective function and, in case of infeasibility, to force convergence of the iterates to an infeasible stationary point. It is shown, under mild assumptions, that whenever the algorithm converges to an infeasible stationary point, the rate of convergence is quadratic. Numerical experiments show that our new approach is as good as the original one when the algorithm converges to a local minimum, but much more efficient in case of infeasibility.
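As background for the method described, the classical augmented Lagrangian loop for min f(x) subject to c(x) = 0 alternates an inexact primal minimization with a first-order multiplier update. A minimal sketch on a toy problem; it contains none of the paper's objective scaling or infeasibility-detection machinery, and the solver choices (gradient descent, fixed penalty) are illustrative assumptions:

```python
import numpy as np

def augmented_lagrangian(grad_f, c, jac_c, x0, rho=10.0, outer=20, inner=300, lr=0.01):
    """Classical augmented Lagrangian loop for min f(x) s.t. c(x) = 0."""
    x = np.asarray(x0, dtype=float)
    y = np.zeros(c(x).size)                      # Lagrange multiplier estimate
    for _ in range(outer):
        for _ in range(inner):                   # inexact inner solve by gradient descent
            g = grad_f(x) + jac_c(x).T @ (y + rho * c(x))
            x = x - lr * g
        y = y + rho * c(x)                       # first-order multiplier update
    return x, y
```

On the feasible toy problem min ||x||^2 subject to x1 + x2 = 1, the loop recovers the minimizer (0.5, 0.5) and the multiplier -1; the paper's contribution concerns precisely the complementary infeasible case, which this classical loop handles poorly.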