Journées de l'optimisation 2016

HEC Montréal, Québec, Canada, May 2–4, 2016


WA9 Derivative-Free Optimization

May 4, 2016, 10:30 – 12:10

Room: PricewaterhouseCoopers

Chaired by Charles Audet

4 presentations

  • 10:30 – 10:55

    Granularity of variables in direct search algorithms

    • Charles Audet, presenter, GERAD - Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Christophe Tribes, Polytechnique Montréal

    We study blackbox optimization in which some or all variables have a minimal granularity. Integer variables are one example, with a granularity of one; variables with a fixed number of decimals are another. We propose a new way to update the mesh size vector from one iteration to the next in the MADS algorithm. The new strategy harmonizes the minimal granularity of the variables with the finest mesh containing all trial points. Another feature of the proposed approach is that the number of decimals in the trial points is controlled.
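
    The following sketch (ours, not the authors' actual rule; the function names and the parameter tau are illustrative, and every variable is assumed to have a positive granularity) shows the flavor of such an update: mesh sizes are snapped to integer multiples of each variable's granularity and never shrink below it, so trial points keep a controlled number of decimals.

    ```python
    import numpy as np

    def update_mesh(delta, granularity, success, tau=0.5):
        """Illustrative mesh-size update: shrink on an unsuccessful
        iteration, grow on a successful one, but keep each component an
        integer multiple of its variable's granularity, never below it."""
        new_delta = delta * (1.0 / tau if success else tau)
        # Snapping to multiples of the granularity keeps the number of
        # decimals in trial points under control (integers: granularity 1).
        snapped = np.round(new_delta / granularity) * granularity
        return np.maximum(granularity, snapped)

    def snap_to_mesh(x, center, delta):
        """Project a trial point onto the current mesh around `center`."""
        return center + np.round((x - center) / delta) * delta

    # One integer variable (granularity 1) and one with two decimals (0.01).
    granularity = np.array([1.0, 0.01])
    delta = np.array([4.0, 0.08])
    delta = update_mesh(delta, granularity, success=False)   # -> [2.0, 0.04]
    print(snap_to_mesh(np.array([1.7, 0.123]), np.zeros(2), delta))  # ~[2.0, 0.12]
    ```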

  • 10:55 – 11:20

    Using inexact subgradients to compute proximal points of convex functions

    • Warren Hare, presenter, UBC
    • Chayne Planiden, UBC

    Proximal points play a central role in a number of non-smooth optimization algorithms, and some recent work has extended these algorithms to a DFO setting. However, past work has focused on extending entire (complicated) algorithms: any subroutine that computes a proximal point is hidden within the developed method and analyzed only in light of that method. In this work, we develop an inexact bundle method that finds the proximal point of a convex function at a given point. This method can then serve as a foundation for proximal-based methods for non-smooth convex functions in which the oracle returns an exact function value and an inexact subgradient vector.
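
    As a minimal sketch of the object being computed, assuming the standard definition prox_{f,r}(z) = argmin_x f(x) + (r/2)||x - z||^2: the toy code below replaces the talk's bundle method with a plain subgradient scheme on the regularized objective, with a noisy oracle standing in for inexact subgradients (all names are illustrative).

    ```python
    import numpy as np

    def prox_by_subgradient(f, subgrad, z, r=1.0, steps=500):
        """Approximate prox_{f,r}(z) = argmin_x f(x) + (r/2)||x - z||^2
        by subgradient steps on the regularized objective; a toy stand-in
        for a bundle method, tolerant of noisy subgradients."""
        x = z.copy()
        best_x, best_val = x.copy(), f(x)   # at x = z the penalty term is zero
        for k in range(1, steps + 1):
            g = subgrad(x) + r * (x - z)    # subgradient of regularized objective
            x = x - g / (r * k)             # diminishing steps; r-strong convexity
            val = f(x) + 0.5 * r * np.dot(x - z, x - z)
            if val < best_val:
                best_x, best_val = x.copy(), val
        return best_x

    # Example: f(x) = ||x||_1, whose proximal point is soft-thresholding.
    rng = np.random.default_rng(0)
    f = lambda x: np.sum(np.abs(x))
    subgrad = lambda x: np.sign(x) + 0.01 * rng.uniform(-1, 1, x.shape)  # inexact
    z = np.array([2.0, -0.3, 0.5])
    print(prox_by_subgradient(f, subgrad, z))             # close to [1, 0, 0]
    print(np.sign(z) * np.maximum(np.abs(z) - 1.0, 0.0))  # exact answer [1, 0, 0]
    ```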

  • 11:20 – 11:45

    Efficient subproblem solution within MADS with quadratic models

    • Charles Audet, GERAD - Polytechnique Montréal
    • Andrew R. Conn, IBM Research
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Nadir Amaioua, presenter, Polytechnique Montréal

    A subproblem appears when the MADS (Mesh Adaptive Direct Search) algorithm for derivative-free optimization is used with quadratic models. In this context, we explore different algorithms that exploit the structure of the subproblem: the augmented Lagrangian method, the L1 exact penalty function, and the augmented Lagrangian with an L1 penalty term. We implement these algorithms within the NOMAD software package and present their impact on the quality of the solutions.
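
    To illustrate one of the three approaches, here is a sketch of the L1 exact penalty on a made-up quadratic model subproblem (the data H, g and the single constraint are invented; NOMAD's actual subproblem and solvers are not shown). A derivative-free method, SciPy's Nelder-Mead, minimizes the non-smooth penalized model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Invented quadratic models: objective q(x) and one constraint c(x) <= 0.
    H = np.array([[2.0, 0.0], [0.0, 4.0]])
    g = np.array([-2.0, -4.0])
    q = lambda x: 0.5 * x @ H @ x + g @ x
    c = lambda x: np.array([x[0] + x[1] - 1.0])

    def l1_penalty(x, mu):
        """L1 exact penalty: for mu larger than the optimal multiplier,
        its unconstrained minimizer solves the constrained subproblem."""
        return q(x) + mu * np.sum(np.maximum(c(x), 0.0))

    mu, x0 = 10.0, np.zeros(2)
    res = minimize(lambda x: l1_penalty(x, mu), x0, method="Nelder-Mead")
    print(res.x, c(res.x))  # approx. [1/3, 2/3], on the constraint boundary
    ```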

  • 11:45 – 12:10

    Ensembles of surrogates for derivative-free optimization

    • Bastien Talgorn, presenter, McGill University
    • Charles Audet, GERAD - Polytechnique Montréal
    • Michael Kokkolaras, McGill University
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal

    We investigate surrogate-based methods to improve the efficiency of the Mesh Adaptive Direct Search (MADS) derivative-free optimization algorithm. In particular, we build an ensemble of surrogate models and investigate several approaches for selecting the best model. To do so, we introduce the Order Error, a criterion designed specifically to detect the best-suited models. These methods are tested on 10 analytical problems and 2 real applications in aeronautics.
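
    As a rough illustration of order-based selection (the exact Order Error of the talk may differ; the blackbox and model ensemble below are invented), this sketch fits polynomial surrogates of several degrees to cached evaluations and keeps the one whose pairwise ordering of points best matches the true objective. Ordering is a natural target because direct-search methods such as MADS use function values only to compare candidate points.

    ```python
    import numpy as np

    def order_error(f_true, f_model):
        """Fraction of point pairs ordered differently by the surrogate
        and by the true objective (an order-based selection criterion)."""
        n, errors, pairs = len(f_true), 0, 0
        for i in range(n):
            for j in range(i + 1, n):
                pairs += 1
                if (f_true[i] < f_true[j]) != (f_model[i] < f_model[j]):
                    errors += 1
        return errors / pairs

    # Toy ensemble: polynomial fits of different degrees to cached data.
    rng = np.random.default_rng(1)
    x = rng.uniform(-2, 2, 30)
    y = x**3 - x + 0.1 * rng.normal(size=x.size)          # invented blackbox
    models = {d: np.poly1d(np.polyfit(x, y, d)) for d in (1, 2, 3, 5)}
    scores = {d: order_error(y, m(x)) for d, m in models.items()}
    print(scores, "-> selected degree", min(scores, key=scores.get))
    ```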
