Journées de l'optimisation 2022

HEC Montréal, Québec, Canada, May 16–18, 2022


WA5 - Numerical optimization and linear algebra with Julia II

May 18, 2022, 10:30 – 12:10

Room: Louis-Laberge (red)

Chaired by Alexis Montoison

4 presentations

  • 10:30 - 10:55

    Evaluating cone oracles nicely in interior point algorithms

    • Lea Kapelevich, presenter, MIT
    • Erling Andersen, Mosek ApS
    • Juan Pablo Vielma, Google

    We will give a broad overview of the cone oracles required by some current conic interior point solvers. For example, the interior point solver Hypatia uses up to third-order derivative information (which can be evaluated nicely for many cones). The algorithm by Dahl and Andersen (2021) uses similar information, but also requires access to a derivative of the conjugate of the barrier function for the cones in the problem. In this talk, we will discuss when this information is efficiently computable and how this affects the interior point algorithm.
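
    As a minimal illustration of these oracles, consider the logarithmic barrier of the nonnegative orthant, for which the value, gradient, Hessian, third-order directional derivative, and conjugate-barrier gradient all have closed forms. The Julia sketch below is purely illustrative and does not reflect the interface of Hypatia or any other solver.

        using LinearAlgebra

        # Oracles for the log barrier f(x) = -sum(log.(x)) of the nonnegative orthant.
        f(x)        = -sum(log, x)            # barrier value
        grad(x)     = -1 ./ x                 # gradient ∇f(x)
        hess(x)     = Diagonal(1 ./ x.^2)     # Hessian ∇²f(x)
        third(x, u) = -2 .* u.^2 ./ x.^3      # third-order term ∇³f(x)[u,u]

        # Gradient of the conjugate barrier f*(s) = sup_x {-sᵀx - f(x)},
        # the extra oracle needed by the Dahl-Andersen algorithm.
        conjgrad(s) = -1 ./ s                 # ∇f*(s), defined for s > 0

        x = [1.0, 2.0, 0.5]
        hess(x) * x ≈ -grad(x)                # logarithmic homogeneity: ∇²f(x)x = -∇f(x)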

  • 10:55 - 11:20

    A modular regularized interior-point algorithm for convex quadratic optimization in multi-precision

    • Geoffroy Leconte, presenter, Polytechnique Montréal

    In this talk, we describe a Julia implementation of RipQP, a regularized interior-point method for convex quadratic optimization. RipQP can solve problems in several floating-point formats; it can also start in a lower precision, as a form of warm start, and transition to a higher precision. The main cost of the algorithm is the solution of a linear system at each iteration. RipQP can use sparse factorizations, such as Cholesky or indefinite LDL, but it can also rely on Krylov methods from the Julia package Krylov.jl. We compare some of the best Krylov methods on different formulations of the linear system, on both linear and convex quadratic problems.
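
    A schematic of the multi-precision idea, assuming nothing about RipQP's actual interface: assemble the regularized KKT system in a chosen floating-point format T, solve it directly, and use the cheap low-precision phase to warm-start the high-precision one.

        using LinearAlgebra

        # Sketch only: solve the regularized KKT system K = [Q + ρI  Aᵀ; A  -δI]
        # in precision T via a dense Bunch-Kaufman LDLᵀ factorization (`\` on Symmetric).
        # An iterative alternative would be a Krylov method such as MINRES.
        function kkt_solve(Q, A, rhs, ρ, δ, ::Type{T}) where {T}
            K = [T.(Q) + T(ρ)*I  T.(A)';
                 T.(A)           -T(δ)*I]
            return Symmetric(K) \ T.(rhs)
        end

        Q = [2.0 0.0; 0.0 2.0]; A = [1.0 1.0]; rhs = [1.0, 1.0, 1.0]
        x32 = kkt_solve(Q, A, rhs, 1e-6, 1e-6, Float32)   # cheap early iterations
        x64 = kkt_solve(Q, A, rhs, 1e-8, 1e-8, Float64)   # high-precision phase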

  • 11:20 - 11:45

    Coordinate Linear Variance Reduction for Generalized Linear Programming

    • Cheuk Yin Lin, presenter, UW-Madison

    In this talk, we present Coordinate Linear Variance Reduction (CLVR; pronounced “clever”), an efficient, scalable first-order algorithm for the convex-concave min-max problem that originates from reformulating a class of generalized linear programs (GLP), which can include simple nonsmooth convex regularizers and simple convex set constraints. When the regularization terms and constraints are separable, CLVR admits an efficient lazy update strategy that makes its complexity bounds scale with the number of nonzero elements of the linear constraint matrix in (GLP) rather than with the matrix dimensions. We demonstrate the applicability of CLVR on Distributionally Robust Optimization (DRO) problems with ambiguity sets based on both f-divergence and Wasserstein metrics, reformulating them as GLPs by introducing sparsely connected auxiliary variables. We implement our algorithm in Julia and complement our theoretical guarantees with numerical experiments that verify its practical effectiveness.
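
    The lazy update can be sketched on the saddle-point form min_x max_y cᵀx + yᵀ(Ax - b): when a sampled dual coordinate y_i moves, the maintained vector z = Aᵀy changes only on the nonzeros of row i of A. The Julia fragment below is a schematic of that bookkeeping, not the CLVR algorithm itself; the step size σ and the plain residual step are placeholders.

        using SparseArrays

        # Maintain z = Aᵀy lazily: updating y[i] costs O(nnz of row i of A).
        # At stores Aᵀ so that row i of A is column i of At.
        function lazy_dual_update!(y, z, At, b, x, i, σ)
            rows, vals = rowvals(At), nonzeros(At)
            g = -b[i]                          # residual (Ax - b)_i, accumulated sparsely
            for k in nzrange(At, i)
                g += vals[k] * x[rows[k]]
            end
            Δ = σ * g                          # placeholder dual step
            y[i] += Δ
            for k in nzrange(At, i)            # touch only the nonzeros of row i
                z[rows[k]] += Δ * vals[k]
            end
            return y, z
        end

        A = sparse([1.0 0.0 2.0; 0.0 3.0 0.0]); b = [1.0, 2.0]
        At = sparse(A')
        x = zeros(3); y = zeros(2); z = At * y
        lazy_dual_update!(y, z, At, b, x, 1, 0.1)
        z ≈ At * y                             # invariant holds without forming Aᵀy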

  • 11:45 - 12:10

    QLP factorization for dense matrices

    • Alexis Montoison, presenter
    • Michael Saunders, Stanford University
    • Dominique Orban, GERAD - Polytechnique Montréal

    The QLP factorization of a dense matrix A of any shape can be determined more efficiently than by computing the QR factors of A followed by the LQ factors of R. For symmetric matrices, we propose orthogonal tridiagonalization followed by QLP factorization of the resulting tridiagonal matrix. This QLP factorization of dense symmetric matrices requires storage similar to that of the Cholesky and LDL^T factorizations, and is more stable. For unsymmetric or rectangular matrices, we propose orthogonal bidiagonalization. Both reductions are standard for EVD and SVD computation but are also relevant for QLP factorization. Two applications are rank estimation and determining the minimum-norm solution x that minimizes ||b - Ax|| when A in R^{m x n} and rank(A) < min(m, n).
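
    For reference, the baseline that the talk improves upon fits in a few lines of Julia: a column-pivoted QR of A followed by an LQ of the R factor gives A = Q1 L Q2 Pᵀ, and the diagonal of L typically tracks the singular values. This sketch is illustrative only; the proposed tridiagonalization and bidiagonalization approaches compute the same factors more efficiently.

        using LinearAlgebra

        A  = randn(8, 5)
        F1 = qr(A, ColumnNorm())       # A * P = Q1 * R   (column-pivoted QR)
        F2 = lq(F1.R)                  # R = L * Q2, hence A = Q1 * L * Q2 * Pᵀ

        d = sort(abs.(diag(F2.L)); rev = true)
        [d svdvals(A)]                 # diag(L) approximates the singular values,
                                       # which is what makes QLP useful for rank estimation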
