13h30 - 13h55
A two-phase projected gradient algorithm for quadratic programming problems with bounds on the variables and a single linear constraint
Starting from the GPCG algorithm for Bound-constrained convex Quadratic Programming (BQP) problems by Moré and Toraldo [SIAM J. Optim., 1, 1991], we propose an algorithmic framework for the solution of BQP problems with a single linear equality constraint. The framework alternates between two phases: an identification phase, which performs gradient projection steps aimed at identifying a suitable face to be explored, and a minimization phase, which reduces the objective function on that face. To decide when to terminate the minimization phase, we use a proportionality criterion based on the ratio between a measure of optimality within the reduced space and a measure of the violation of the KKT conditions, which extends the ideas of Dostál [SIAM J. Optim., 3, 1997] and Friedlander and Martínez [SIAM J. Optim., 4, 1994] for BQP.
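The identification/minimization alternation can be illustrated with a minimal sketch for the plain bound-constrained case (the talk's setting adds the single linear equality constraint, and uses conjugate gradients rather than the steepest-descent step shown here); the function name and all simplifications are illustrative, not the authors' algorithm:

```python
import numpy as np

def two_phase_bqp(A, b, l, u, iters=50):
    """Sketch of a two-phase method for min 1/2 x'Ax - b'x, l <= x <= u.
    Phase 1 (identification): a projected-gradient step selects a face.
    Phase 2 (minimization): an exact line search restricted to the free
    variables reduces the objective on that face."""
    x = np.clip(np.zeros_like(b), l, u)
    alpha = 1.0 / np.linalg.norm(A, 2)  # safe step for a convex quadratic
    for _ in range(iters):
        g = A @ x - b
        # identification phase: one gradient projection step onto the box
        x = np.clip(x - alpha * g, l, u)
        free = (x > l) & (x < u)  # face identified by the current iterate
        # minimization phase: steepest descent on the free variables only
        if free.any():
            g = A @ x - b
            d = np.zeros_like(x)
            d[free] = -g[free]
            dAd = d @ (A @ d)
            if dAd > 0:
                x = np.clip(x + ((d @ d) / dAd) * d, l, u)
    return x
```

A proportionality test, as in the abstract, would replace the fixed alternation with a data-driven switch between the two phases.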
13h55 - 14h20
Let's Make Block Coordinate Descent Go Fast!
Block coordinate descent (BCD) methods are widely used for large-scale numerical optimization because of their cheap iteration costs, low memory requirements, and amenability to parallelization. Three main algorithmic choices influence the performance of BCD methods: the block partitioning strategy, the block selection rule, and the block update rule. We explore these three building blocks and propose variations of each that can lead to significantly faster BCD methods. We support all of our findings with numerical results for the classic machine learning problems of logistic regression, multi-class logistic regression, support vector machines, and label propagation.
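The three building blocks show up clearly in a minimal BCD sketch on a strictly convex quadratic; the fixed contiguous partition, the greedy (Gauss-Southwell-style) selection rule, and the exact block update below are just one illustrative choice for each, not the variants proposed in the talk:

```python
import numpy as np

def bcd_quadratic(A, b, block_size=2, iters=100, rule="greedy"):
    """Sketch of block coordinate descent for min 1/2 x'Ax - b'x.
    Partitioning: fixed contiguous blocks of size block_size.
    Selection: block with the largest gradient norm ("greedy"),
    or a uniformly random block ("random").
    Update: solve the block subproblem exactly."""
    n = len(b)
    blocks = [np.arange(i, min(i + block_size, n)) for i in range(0, n, block_size)]
    x = np.zeros(n)
    rng = np.random.default_rng(0)
    for _ in range(iters):
        g = A @ x - b
        if rule == "greedy":
            k = max(range(len(blocks)), key=lambda j: np.linalg.norm(g[blocks[j]]))
        else:
            k = rng.integers(len(blocks))
        idx = blocks[k]
        # exact block update: solve A[idx, idx] d = -g[idx]
        x[idx] += np.linalg.solve(A[np.ix_(idx, idx)], -g[idx])
    return x
```

Swapping any one of the three choices (e.g. random vs. greedy selection) changes the convergence behavior while leaving the rest of the loop untouched, which is what makes them natural knobs to study separately.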
14h20 - 14h45
New active-set Frank-Wolfe variants for minimization over the simplex and the l1-ball
An active-set algorithmic framework is proposed for minimizing a function over the simplex. It combines an active-set estimate (to identify the variables that are zero at a stationary point) with Frank-Wolfe-type directions. Two steps are performed at each iteration: first, a decrease in the objective function is obtained by setting the active variables to zero and updating one non-active variable; then, a search direction is computed in the subspace of the non-active variables. Global convergence is proved and a linear convergence rate is guaranteed under additional assumptions. Then, the algorithm is extended to solve minimization problems over the l1-ball. Numerical results are provided.
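The Frank-Wolfe-type directions underlying the framework can be sketched in their plain form over the unit simplex; the active-set estimate and the two-step iteration of the abstract are deliberately omitted, so this is only the building block, not the proposed method:

```python
import numpy as np

def frank_wolfe_simplex(grad_f, x0, iters=500):
    """Minimal Frank-Wolfe sketch over the unit simplex.
    The linear minimization oracle over the simplex is the vertex e_i
    with i = argmin_i of the gradient; the step size is the standard
    2/(k+2) schedule."""
    x = x0.copy()
    for k in range(iters):
        g = grad_f(x)
        i = int(np.argmin(g))  # FW vertex: e_i minimizes g'v over the simplex
        d = -x
        d[i] += 1.0            # direction d = e_i - x
        x = x + (2.0 / (k + 2)) * d
    return x
```

Since each iterate is a convex combination of simplex vertices, feasibility (nonnegativity, unit sum) is preserved for free; the active-set estimate in the talk accelerates this scheme by fixing the identified zero variables.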
14h45 - 15h10
Quasi-Newton based preconditioning techniques for Nonlinear Conjugate Gradient Methods
In this work we study new preconditioning strategies for Nonlinear Conjugate Gradient (NCG) methods in large-scale unconstrained optimization. We propose novel preconditioners constructed by collecting information from the NCG iterations, in order to generate an approximate inverse of the Hessian matrix. To this aim, we exploit information from quasi-Newton updates: at each NCG iteration we consider preconditioners based on new low-rank quasi-Newton symmetric updating formulae, obtained as a by-product of the NCG iterations. Extensive numerical experiments are reported on a large selection of CUTEst test problems.
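The idea of recycling iteration by-products into a quasi-Newton preconditioner can be sketched with a Polak-Ribière NCG loop preconditioned by a memoryless BFGS approximation built from the latest pair (s, y); this one-pair rank-two update is a simplified stand-in for the low-rank formulae of the talk, and the line search and safeguards are generic choices:

```python
import numpy as np

def precond_ncg(f, grad, x0, iters=100):
    """Sketch of Polak-Ribiere NCG with a memoryless-BFGS preconditioner
    assembled from the most recent step s = x_new - x and gradient
    difference y = g_new - g (by-products of the NCG iteration)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break
        slope = g @ d
        if slope >= 0:               # safeguard: fall back to steepest descent
            d, slope = -g, -(g @ g)
        t, f0 = 1.0, f(x)            # backtracking Armijo line search
        while f(x + t * d) > f0 + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g

        def H(v):  # memoryless BFGS inverse-Hessian product, H0 = I
            if s @ y <= 1e-12:       # skip update if curvature is not positive
                return v
            rho = 1.0 / (s @ y)
            w = v - rho * (s @ v) * y
            return w - rho * (y @ w) * s + rho * (s @ v) * s

        beta = (g_new @ H(g_new - g)) / (g @ H(g))  # preconditioned PR
        d = -H(g_new) + max(beta, 0.0) * d           # PR+ restart safeguard
        x, g = x_new, g_new
    return x
```

Here the preconditioner costs only a few inner products per iteration, which is the appeal of building it from quantities the NCG method computes anyway.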