18th INFORMS Computing Society (ICS) Conference

Toronto, Canada, 14 — 16 March 2025

Algorithms for Nonlinear Optimization

Mar 14, 2025 04:45 PM – 06:15 PM

Location: East Common 

Chaired by Jonathan Eckstein

3 Presentations

  • 04:45 PM - 05:07 PM

    First-Order Method for Convex Smooth Function Constrained Variational Inequalities

    • Yan Wu, presenter, Clemson University
    • Yuyuan Ouyang, Clemson University
    • Qi Luo, The University of Iowa

    Constrained Variational Inequality (CVI) is a versatile optimization framework for modeling and solving complementarity and equilibrium problems. In this paper, we introduce a novel Accelerated Constrained Operator Extrapolation (ACOE) method to address single-constrained monotone variational inequality (sCMVI) problems. Our analysis demonstrates that ACOE significantly improves the convergence rate to O(1/k) with respect to both operator and constraint first-order evaluations—matching the optimal performance achieved by unconstrained variational inequality algorithms. To further address more complex multi-constrained monotone variational inequality (mCMVI) problems, we propose the Accelerated Constrained Operator Extrapolation-Sliding (ACOE-S) algorithm. This approach maintains a convergence rate of O(1/k) for operator evaluations while leveraging a sliding technique to further reduce the number of first-order evaluations required for managing multiple constraints to O(1/k^2).
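The basic operator-extrapolation iteration that methods like ACOE build on (shown here in its plain unconstrained form on a toy bilinear saddle-point problem; this is a sketch of the classical building block, not the ACOE method itself, whose constraint handling and rate guarantees are in the paper):

```python
def F(z):
    # Monotone operator of the bilinear saddle point min_x max_y x*y:
    # F(x, y) = (y, -x); the unique solution is (0, 0).
    x, y = z
    return (y, -x)

def operator_extrapolation(z0, step=0.3, iters=200):
    """Plain operator extrapolation (optimistic gradient) for a
    monotone variational inequality: z_{k+1} = z_k - step * (2 F(z_k) - F(z_{k-1}))."""
    z_prev, z = z0, z0
    for _ in range(iters):
        Fz, Fz_prev = F(z), F(z_prev)
        # Extrapolated operator value 2 F(z_k) - F(z_{k-1}) reuses the
        # previous evaluation, so each iteration needs one new call to F.
        z_next = tuple(zi - step * (2.0 * fi - fpi)
                       for zi, fi, fpi in zip(z, Fz, Fz_prev))
        z_prev, z = z, z_next
    return z

x, y = operator_extrapolation((1.0, 1.0))
```

For this bilinear example the iterates spiral into the solution (0, 0); a plain forward step z - step * F(z) would diverge on the same problem, which is what motivates the extrapolation.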

  • 05:07 PM - 05:29 PM

    Fast Optimization over the Simplex

    • Saeed Ghadimi, presenter, University of Waterloo
    • Henry Wolkowicz, University of Waterloo
    • Arnesh Sujanani, University of Waterloo

    In this work, we discuss fast and scalable methods for solving simplex-constrained optimization problems. We first present a semismooth Newton method for faster projection onto the simplex, and then develop a restarted FISTA variant specialized for general optimization over the simplex that uses this semismooth Newton method to solve its projection subproblems. We also discuss a projection-free method over the simplex based on a Hadamard parametrization.
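The projection step that the semismooth Newton method accelerates is the Euclidean projection onto the probability simplex. For reference, a minimal sketch of the classical sort-and-threshold projection (the standard baseline, not the semismooth Newton approach presented in the talk):

```python
def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}
    via the classical sort-and-threshold algorithm, O(n log n)."""
    u = sorted(v, reverse=True)
    cumsum, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        cumsum += ui
        t = (cumsum - 1.0) / i
        if ui - t > 0:
            theta = t  # threshold for the largest feasible support
    return [max(x - theta, 0.0) for x in v]

p = project_simplex([2.0, 0.0])   # projects to a vertex of the simplex
```

The semismooth Newton approach in the talk instead solves the one-dimensional root-finding problem for the threshold directly, which avoids the sort and scales better on large instances.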

  • 05:29 PM - 05:51 PM

    Adaptive Relaxation in Approximate Augmented Lagrangian Methods, with an ADMM-like Application

    • Jonathan Eckstein, presenter, Rutgers University
    • Chang Yu, Rutgers University

    This presentation describes a new relative-error approximate augmented Lagrangian method in which the length of the multiplier adjustment step is dynamically adapted to the accuracy of the subproblem solution. We report computational results using this method as the outer loop of a two-level algorithm for solving convex optimization problems in standard Fenchel-Rockafellar form. The inner loop of the two-level algorithm is ADMM-like, but incorporates a form of Nesterov acceleration due to Chambolle and Dossal, exploiting a connection between ADMM and proximal gradient methods that will also be described.
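The Chambolle–Dossal variant of Nesterov acceleration mentioned above uses momentum weights (k - 1)/(k + a) for a parameter a > 2, which preserves the O(1/k^2) rate and additionally guarantees convergence of the iterates. A minimal proximal-gradient sketch on a toy scalar problem (illustrative only; the talk applies this acceleration inside an ADMM-like inner loop, and the problem below is an assumption for demonstration):

```python
def soft_threshold(z, t):
    # Proximal operator of t * |x|.
    return max(z - t, 0.0) if z > 0 else min(z + t, 0.0)

def accelerated_prox_grad(x0, step=0.5, a=3.0, iters=50):
    """Proximal gradient for min 0.5*(x - 3)^2 + |x| (minimizer x* = 2),
    with Chambolle-Dossal momentum weights (k - 1)/(k + a), a > 2."""
    x_prev, y = x0, x0
    for k in range(1, iters + 1):
        # Forward (gradient) step on the smooth part, then prox of |x|.
        x = soft_threshold(y - step * (y - 3.0), step)
        beta = (k - 1.0) / (k + a)      # Chambolle-Dossal momentum weight
        y = x + beta * (x - x_prev)     # extrapolated point for next step
        x_prev = x
    return x_prev

x_star = accelerated_prox_grad(0.0)
```

Classical FISTA uses weights derived from the recursion t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2; the Chambolle–Dossal weights are a simpler over-relaxed choice with the same worst-case rate.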
