2016 Optimization Days

HEC Montréal, Québec, Canada, May 2–4, 2016


WA9 Derivative-Free Optimization

May 4, 2016 10:30 AM – 12:10 PM

Location: PricewaterhouseCoopers

Chaired by Charles Audet

4 Presentations

  • 10:30 AM – 10:55 AM

    Granularity of variables in direct search algorithms

    • Charles Audet, presenter, GERAD - Polytechnique Montréal
    • Sebastien Le Digabel, Polytechnique Montréal
    • Christophe Tribes, Polytechnique Montréal

    We study blackbox optimization in which some or all variables have a minimal granularity. Integer variables, whose granularity equals one, are one example; variables with a fixed number of decimals are another. We propose a new way to update the mesh size vector from one iteration to the next in the MADS algorithm. The new strategy harmonizes the minimal granularity of the variables with the finest mesh containing all trial points. Another feature of the proposed approach is that the number of decimals in the trial points is controlled.
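
A minimal sketch of the kind of granularity-aware mesh update this abstract describes; the update rule, function names, and parameters below are illustrative assumptions, not the authors' actual algorithm:

```python
import numpy as np

def update_mesh(delta, granularity, success, shrink=0.5, grow=2.0):
    """Hypothetical per-variable mesh-size update that respects a
    minimal granularity (0 means the variable is purely continuous)."""
    factor = grow if success else shrink
    new_delta = delta * factor
    for i, g in enumerate(granularity):
        if g > 0:
            # Never refine below the variable's granularity, and keep the
            # mesh size an integer multiple of it, so the number of
            # decimals in trial points stays controlled.
            new_delta[i] = max(g, round(new_delta[i] / g) * g)
    return new_delta

# Example: a continuous variable, an integer variable (granularity 1),
# and a variable with two decimals (granularity 0.01).
delta = np.array([1.0, 4.0, 0.08])
print(update_mesh(delta, [0.0, 1.0, 0.01], success=False))
# -> approximately [0.5, 2.0, 0.04]
```

Snapping the mesh size to a multiple of each variable's granularity is what keeps the number of decimals in trial points bounded.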

  • 10:55 AM – 11:20 AM

    Using inexact subgradients to compute proximal points of convex functions

    • Warren Hare, presenter, UBC
    • Chayne Planiden, UBC

    Proximal points play a central role in a number of non-smooth optimization algorithms. Some recent work has extended these algorithms to a DFO setting. However, past work has focused on extending entire (complicated) algorithms: any subroutine to compute a proximal point is hidden within the developed method and analyzed only in light of that method. In this work, we develop a standalone inexact bundle method to find the proximal point of a convex function at a given point. This method can now be used as a foundation in proximal-based methods for non-smooth convex functions where the oracle returns an exact function value and an inexact subgradient vector.
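
As a rough illustration of computing a proximal point with a cutting-plane (bundle) model built from exact function values and inexact subgradients; this is a hypothetical sketch, not the authors' method, and `prox_bundle`, its stopping rule, and its parameters are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def prox_bundle(f, inexact_subgrad, x, r=1.0, tol=1e-4, max_iter=50):
    """Hypothetical bundle scheme for the proximal point
    prox_f(x) = argmin_y f(y) + (r/2)||y - x||^2."""
    x = np.asarray(x, dtype=float)
    bundle = [(x.copy(), f(x), inexact_subgrad(x))]
    y = x.copy()
    for _ in range(max_iter):
        # Piecewise-linear model of f built from the bundle of cuts.
        def model(z):
            return max(fv + g @ (z - yv) for yv, fv, g in bundle)
        # Prox subproblem on the model (solved numerically here for
        # brevity; in practice this is a small QP).
        res = minimize(lambda z: model(z) + 0.5 * r * np.sum((z - x) ** 2),
                       y, method="Nelder-Mead")
        y_new = res.x
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        bundle.append((y_new, f(y_new), inexact_subgrad(y_new)))
        y = y_new
    return y

# Example: f(y) = |y|, with a subgradient perturbed by small noise.
f = lambda y: abs(y[0])
sg = lambda y: np.array([np.sign(y[0])]) + 1e-3
print(prox_bundle(f, sg, x=[2.0], r=1.0))  # exact prox is y = 1.0
```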

  • 11:20 AM – 11:45 AM

    Efficient subproblem solution within MADS with quadratic models

    • Charles Audet, GERAD - Polytechnique Montréal
    • Andrew R. Conn, IBM Research
    • Sebastien Le Digabel, Polytechnique Montréal
    • Nadir Amaioua, presenter, Polytechnique Montréal

    A subproblem arises when the MADS (Mesh Adaptive Direct Search) derivative-free optimization algorithm is used with quadratic models. In this context, we explore different algorithms that exploit the structure of the subproblem: the augmented Lagrangian method, the L1 exact penalty function, and the augmented Lagrangian with an L1 penalty term. We implement these algorithms within the NOMAD software package and present their impact on the quality of the solutions.
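
A sketch of the L1 exact-penalty idea applied to such a model subproblem; `l1_penalty_solve`, its penalty-growth schedule, and the toy quadratic models are illustrative assumptions (the talk also studies augmented Lagrangian variants):

```python
import numpy as np
from scipy.optimize import minimize

def l1_penalty_solve(q_obj, q_cons, x0, bounds, mu=10.0, growth=10.0,
                     n_rounds=5):
    """Hypothetical L1 exact-penalty loop for the model subproblem
    min q_obj(x) s.t. q_cons_j(x) <= 0 within bounds."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_rounds):
        def phi(z):
            # L1 penalty: objective plus weighted sum of violations.
            violation = sum(max(0.0, c(z)) for c in q_cons)
            return q_obj(z) + mu * violation
        x = minimize(phi, x, method="Powell", bounds=bounds).x
        if all(c(x) <= 1e-6 for c in q_cons):
            break  # feasible: the exact penalty has done its job
        mu *= growth  # otherwise increase the penalty weight
    return x

# Example: quadratic model objective with one quadratic model constraint.
q_obj = lambda x: (x[0] - 2) ** 2 + (x[1] + 1) ** 2
cons = [lambda x: x[0] ** 2 + x[1] ** 2 - 1.0]  # unit-disk constraint
print(l1_penalty_solve(q_obj, cons, x0=[0.0, 0.0],
                       bounds=[(-2, 2), (-2, 2)]))
```

With a penalty weight above the Lagrange multiplier, the L1 penalty is exact: the penalized minimizer coincides with the constrained one, here the boundary point near (0.894, -0.447).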

  • 11:45 AM – 12:10 PM

    Ensembles of surrogates for derivative-free optimization

    • Bastien Talgorn, presenter, McGill University
    • Charles Audet, GERAD - Polytechnique Montréal
    • Michael Kokkolaras, McGill University
    • Sebastien Le Digabel, Polytechnique Montréal

    We investigate surrogate-based methods to improve the efficiency of the Mesh Adaptive Direct Search (MADS) derivative-free optimization algorithm. In particular, we build an ensemble of surrogate models and investigate several approaches to select the best model. To do so, we introduce the Order Error, which is designed specifically to detect the best-suited models. These methods are tested on 10 analytical problems and 2 real applications in aeronautics.
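
A toy sketch of ranking an ensemble of surrogates by how well they preserve the order of cached evaluations; the definition of `order_error` below is an assumption in the spirit of the abstract, not necessarily the authors' exact metric:

```python
import itertools
import numpy as np

def order_error(f_true, f_model):
    """Fraction of point pairs whose relative order the surrogate
    gets wrong (illustrative definition)."""
    pairs = list(itertools.combinations(range(len(f_true)), 2))
    wrong = sum(
        (f_true[i] < f_true[j]) != (f_model[i] < f_model[j])
        for i, j in pairs
    )
    return wrong / len(pairs)

def select_best(models, X, f_true):
    """Pick, from an ensemble of fitted surrogates, the one whose
    predictions best preserve the ordering of cached evaluations."""
    errors = [order_error(f_true, m(X)) for m in models]
    return models[int(np.argmin(errors))]

# Example: two toy surrogates of f(x) = x^2 evaluated at cached points.
X = np.linspace(-2, 2, 9)
f_true = X ** 2
models = [lambda Z: Z ** 2 + 0.1, lambda Z: -Z]  # good vs. bad ordering
best = select_best(models, X, f_true)
print(order_error(f_true, best(X)))  # ~0 for the well-ordered surrogate
```

Ranking by order preservation rather than prediction accuracy suits direct search, where the algorithm only needs the surrogate to sort candidate points correctly.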
