HEC Montréal, Canada, May 6–8, 2013
2013 Optimization Days
MA10 Algèbre linéaire en optimisation / The Linear Algebra of Optimization
May 6, 2013 10:30 AM – 12:10 PM
Location: Nancy et Michel-Gaucher
Chaired by Ahad Dehghani
4 Presentations

10:30 AM – 10:55 AM
Linearizing the Method of Conjugate Gradients
The method of conjugate gradients (CG) is widely used for the iterative solution of large sparse systems of equations Ax=b, where A is symmetric positive definite. We present an expression for the Jacobian matrix of a CG iterate with respect to b. We discuss data assimilation applications in which these ideas are used for first-order propagation of covariance matrices.
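As a minimal illustration of the setting (not the authors' derivation): when CG is run to convergence on a symmetric positive definite system, the iterate approaches A⁻¹b, so its Jacobian with respect to b approaches A⁻¹. The sketch below, on a small random system with illustrative names, checks this with finite differences:

```python
import numpy as np

def cg(A, b, tol=1e-12, maxiter=200):
    """Textbook conjugate gradients for SPD A."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)

# Finite-difference Jacobian of the converged CG iterate with respect to b.
h = 1e-6
J = np.column_stack([(cg(A, b + h * e) - cg(A, b)) / h for e in np.eye(5)])

# At convergence x(b) = A^{-1} b, so the Jacobian should match inv(A).
print(np.allclose(J, np.linalg.inv(A), atol=1e-4))  # prints True
```

The interesting case in the talk is the *truncated* iterate, whose Jacobian differs from A⁻¹; the check above only illustrates the limiting behavior.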

10:55 AM – 11:20 AM
A Primal-Dual Regularized Interior-Point Method for Semidefinite Programming
Interior-point methods in semidefinite programming (SDP) require the solution of a sequence of linear systems which are used to derive the search directions. Safeguards are typically required in order to handle rank-deficient Jacobians and free variables. We show that it is possible to recover an optimal solution of the original primal-dual pair via inaccurate solves of a sequence of regularized SDPs for both the NT and dual HKM directions. Benefits of our approach include increased robustness and a simpler implementation. Our method does not require the constraints to be linearly independent and does not assume that Slater's condition holds. We report numerical experience on standard problems that illustrate our findings.

11:20 AM – 11:45 AM
A Regularized Interior-Point Method for Constrained Linear Least Squares
We propose an infeasible interior-point algorithm for constrained linear least-squares problems based on the primal-dual regularization of convex programs of Friedlander and Orban (2012). At each iteration, the sparse LDL factorization of a symmetric quasi-definite matrix is computed. This coefficient matrix is shown to be uniformly bounded and nonsingular. We establish conditions under which a solution of the original problem is recovered. The regularization allows us to dispense with the assumption that the active gradients are linearly independent. Although the implementation described here is factorization-based, it paves the way to a matrix-free implementation in which a regularized unconstrained linear least-squares problem is solved at each iteration. We report on computational experience and illustrate the potential advantages of our approach.
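The quasi-definite structure can be illustrated on the simpler equality-constrained case (a sketch with assumed data, not the method of the talk, which handles general constraints through interior-point iterations): regularizing min ½‖Cx − d‖² subject to Ax = b yields a system whose (1,1) block is positive definite and whose (2,2) block is negative definite, hence symmetric quasi-definite.

```python
import numpy as np

rng = np.random.default_rng(1)
C = rng.standard_normal((8, 4))      # least-squares operator
d = rng.standard_normal(8)
A = rng.standard_normal((2, 4))      # equality constraints Ax = b
b = rng.standard_normal(2)

rho, delta = 1e-8, 1e-8              # primal and dual regularization parameters

# Symmetric quasi-definite system: positive definite (1,1) block, negative
# definite (2,2) block, so an LDL^T factorization exists under any
# symmetric permutation.
K = np.block([[C.T @ C + rho * np.eye(4), A.T],
              [A, -delta * np.eye(2)]])
rhs = np.concatenate([C.T @ d, b])

sol = np.linalg.solve(K, rhs)        # a sparse LDL^T solve in practice
x, y = sol[:4], sol[4:]

# Constraints hold up to the regularization level delta * ||y||.
print(np.allclose(A @ x, b, atol=1e-5))  # prints True
```

The dense `np.linalg.solve` call stands in for the sparse LDL factorization mentioned in the abstract; the point of the sketch is only the block structure of K.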

11:45 AM – 12:10 PM
Projected Krylov Methods for Saddle-Point Systems
Projected Krylov methods are full-space formulations of Krylov methods that take place in a null space. Provided projections onto the null space can be computed accurately, those methods only require products between an operator and vectors lying in the null space. In the symmetric case, their convergence is thus entirely described by the spectrum of the (preconditioned) operator restricted to the null space. We provide systematic principles for obtaining the projected form of any well-defined Krylov method. Equivalence properties between projected Krylov methods and standard Krylov methods applied to a saddle-point operator with a constraint preconditioner allow us to show that, contrary to common belief, certain known methods such as MINRES and SYMMLQ are well defined in the presence of an indefinite preconditioner.
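A minimal sketch of the projected idea for CG, assuming an explicitly formed orthogonal projector onto null(B) (a real implementation would apply the projection through a factorization of an augmented system, and the names below are illustrative): every residual is projected back into the null space, so all iterates satisfy the constraint Bx = 0.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 2
M = rng.standard_normal((n, n))
H = M @ M.T + np.eye(n)              # symmetric positive definite operator
B = rng.standard_normal((m, n))      # constraint operator; null(B) has dim n - m
c = rng.standard_normal(n)

# Orthogonal projector onto null(B), formed densely for illustration only.
P = np.eye(n) - B.T @ np.linalg.solve(B @ B.T, B)

def projected_cg(H, P, c, tol=1e-10, maxiter=500):
    """Full-space CG with residuals projected into null(B)."""
    x = np.zeros(len(c))
    r = P @ c                        # projected initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * (P @ Hp)        # project: iterates stay in null(B)
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x = projected_cg(H, P, c)
print(np.allclose(B @ x, 0, atol=1e-8))   # iterate lies in the null space
```

Only products of H with vectors in null(B) are ever formed, and convergence is governed by the spectrum of PHP on that subspace, matching the description in the abstract.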