15th EUROPT Workshop on Advances in Continuous Optimization

Montréal, Canada, July 12–14, 2017


Augmented Lagrangian and Proximal Point Methods

Jul 12, 2017 11:30 AM – 12:45 PM

Location: TD Assurance Meloche Monnex

Chaired by Paulo Oliveira

3 Presentations

  • 11:30 AM - 11:55 AM

    A scaling proximal point algorithm in domains of positivity

    • Paulo Oliveira, presenter, Federal University of Rio de Janeiro
    • Ronaldo Gregorio, Federal Rural University of Rio de Janeiro
    • Charlan Alves, Federal University of Rio de Janeiro

    This paper presents a proximal point method for convex minimization problems on a special class of Hadamard manifolds called homogeneous domains of positivity. The method is related, in a certain sense, to a particularization of the de Rham decomposition theorem for symmetric spaces, and it computes the minimum of Riemannian convex functions. The homogeneous domains of positivity considered in this work are those of reducible type. Computational experiments with Riemannian averages of symmetric positive definite matrices, such as those arising in diffusion tensor imaging (DTI), suggest that the method is useful in practical problems.
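    For reference, the classical (Euclidean) proximal point iteration underlying the talk can be sketched as follows; the Riemannian version on domains of positivity replaces the squared norm with a geodesic distance, which is not shown here. The soft-thresholding example is purely illustrative and is not from the abstract:

```python
import numpy as np

def proximal_point(prox_f, x0, lam=1.0, iters=10):
    """Euclidean proximal point iteration:
    x_{k+1} = prox_{lam f}(x_k) = argmin_x f(x) + ||x - x_k||^2 / (2 lam)."""
    x = x0
    for _ in range(iters):
        x = prox_f(x, lam)
    return x

# Illustration: f(x) = |x|, whose prox is soft-thresholding; the minimizer is 0.
soft = lambda x, lam: np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
x_star = proximal_point(soft, x0=5.0, lam=1.0, iters=10)
```

    Each iteration solves a regularized subproblem whose strong convexity makes it easier than the original one; the iterates converge to a minimizer of f.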

  • 11:55 AM - 12:20 PM

    Extragradient method in optimization: Convergence and complexity

    • Phong Nguyen, presenter

    We consider the extragradient method to minimize the sum of two functions, the first being smooth and the second convex. Under the Kurdyka-Łojasiewicz assumption, we prove that the sequence produced by the extragradient method converges to a critical point of the problem and has finite length. The analysis is extended to the case where both functions are convex, for which we provide a 1/k convergence rate, classical for gradient methods. Furthermore, we show that the recent small-prox complexity result can be applied to this method. Studying the extragradient method is also an occasion to describe exact line search for proximal decomposition methods. We provide implementation details for this scheme on the ℓ1-regularized least-squares problem and give numerical results suggesting that combining nonaccelerated methods with exact line search can be a competitive choice.
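    A minimal sketch of the composite extragradient iteration on the ℓ1-regularized least-squares problem mentioned in the abstract, with a fixed step size rather than the exact line search the talk describes (step size, iteration count, and test problem are assumptions for illustration):

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: the prox operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def extragradient_l1(A, b, lam, step, iters=500):
    """Extragradient for min_x 0.5 ||Ax - b||^2 + lam ||x||_1:
    y_k     = prox(x_k - s * grad f(x_k))      (prediction step)
    x_{k+1} = prox(x_k - s * grad f(y_k))      (correction step)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        y = soft(x - step * A.T @ (A @ x - b), step * lam)
        x = soft(x - step * A.T @ (A @ y - b), step * lam)
    return x

# With A = I the minimizer is soft(b, lam): here (2.0, 0.0).
x_hat = extragradient_l1(np.eye(2), np.array([3.0, 0.5]), lam=1.0, step=0.5)
```

    The correction step reuses the gradient evaluated at the predicted point y_k, which is what distinguishes the extragradient scheme from plain proximal gradient.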

  • 12:20 PM - 12:45 PM

    An augmented Lagrangian method for equality constrained optimization with rapid infeasibility detection capabilities

    • Paul Armand, Université de Limoges
    • Ngoc Nguyen Tran, presenter, Université de Limoges

    We present a primal-dual augmented Lagrangian method for solving an equality-constrained minimization problem that is able to rapidly detect infeasibility. A parameter is introduced to scale the objective function and, in case of infeasibility, to force convergence of the iterates to an infeasible stationary point. It is shown, under mild assumptions, that whenever the algorithm converges to an infeasible stationary point, the rate of convergence is quadratic. Numerical experiments show that the new approach matches the original method when the algorithm converges to a local minimum, and is much more efficient in cases of infeasibility.
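    For context, a textbook augmented Lagrangian loop for an equality-constrained problem can be sketched on a toy quadratic (the primal-dual variant, scaling parameter, and infeasibility detection of the talk are not reproduced; problem and penalty value are assumptions for illustration):

```python
import numpy as np

def augmented_lagrangian(rho=10.0, iters=20):
    """Augmented Lagrangian for the toy problem
        min ||x||^2  subject to  a^T x = 1,  with a = (1, 1),
    whose solution is x* = (0.5, 0.5) with multiplier y* = -1.
    L_A(x, y) = ||x||^2 + y (a^T x - 1) + (rho/2)(a^T x - 1)^2."""
    a = np.array([1.0, 1.0])
    y, x = 0.0, np.zeros(2)
    for _ in range(iters):
        # Inner step: minimize L_A exactly in x (a linear solve here,
        # since the problem is quadratic): (2I + rho a a^T) x = (rho - y) a.
        H = 2.0 * np.eye(2) + rho * np.outer(a, a)
        x = np.linalg.solve(H, (rho - y) * a)
        # Multiplier update driven by the constraint residual.
        y += rho * (a @ x - 1.0)
    return x, y

x_sol, y_sol = augmented_lagrangian()
```

    The multiplier update shrinks the constraint violation by roughly a factor 1/(1 + rho) per outer iteration on this problem, which is the classical linear rate of the method of multipliers.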
