Journées de l'optimisation 2022

HEC Montréal, Québec, Canada, May 16–18, 2022


TA8 - Derivative-free and Blackbox Optimization III

May 17, 2022, 10:30 – 12:10

Room: METRO INC. (yellow)

Chaired by Ludovic Salomon

4 presentations

  • 10:30 – 10:55

    Bi-fidelity Surrogate Modelling: Showcasing the need for new test instances

    • Nicolau Andres-Thio, presenter, The University of Melbourne
    • Kate Smith-Miles, The University of Melbourne
    • Mario Andres Munoz, The University of Melbourne

    In recent years, Multi-fidelity Expensive Black-Box (Mf-EBB) methods have received increasing attention due to their strong applicability to industrial design problems. In the field of Mf-EBB, a problem instance consists of an expensive yet accurate source of information and one or more cheap yet less accurate sources of information. The field aims to provide techniques either to accurately explain how decisions affect design outcomes, or to find the best decisions to optimise design outcomes. Many techniques which use surrogate models have been developed to serve both aims. Only in recent years, however, have researchers begun to explore the conditions under which these new techniques are reliable, often focusing on problems with a single low-fidelity function, known as Bi-fidelity Expensive Black-Box (Bf-EBB) problems. This presentation discusses new developments in the creation and analysis of test instances in this field. It does so by showing the potentially misleading results that can be reached using only the instances currently found in the literature, and by demonstrating the critical need for a more heterogeneous test suite for algorithm assessment. Newly proposed instances and features will be presented, and their importance in algorithm analysis will be discussed.
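
To make the Bf-EBB setting concrete, a sketch of the kind of instance commonly found in the existing literature: the classical one-dimensional Forrester pair, where a cheap low-fidelity source is a scaled and linearly shifted version of the expensive one. This example is from the general bi-fidelity literature, not from the talk, and the newly proposed instances it discusses are not reproduced here.

```python
import math

def f_high(x):
    # Expensive yet accurate source of information (high fidelity),
    # usually evaluated on x in [0, 1].
    return (6.0 * x - 2.0) ** 2 * math.sin(12.0 * x - 4.0)

def f_low(x, a=0.5, b=10.0, c=-5.0):
    # Cheap yet less accurate source (low fidelity): a scaled copy of
    # f_high plus a linear bias, with the constants commonly reported
    # in the literature.
    return a * f_high(x) + b * (x - 0.5) + c
```

A Bf-EBB problem instance is then the pair (f_high, f_low); surrogate methods fit a model using many cheap f_low evaluations and few f_high evaluations.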

  • 10:55 – 11:20

    Combining Cross Entropy and MADS methods for inequality constrained global optimization

    • Romain Couderc, presenter
    • Charles Audet, GERAD - Polytechnique Montréal
    • Jean Bigeon, Univ. Grenoble Alpes, CNRS, Grenoble INP, G-SCOP, F-38000 Grenoble, France

    This presentation proposes a way to combine the Mesh Adaptive Direct Search (MADS) algorithm with the Cross-Entropy (CE) method for nonsmooth constrained optimization. The CE method is used as an exploration step by the MADS algorithm. The resulting combination retains the convergence properties of MADS while enabling an efficient exploration that moves away from local minima. The CE method samples trial points according to a multivariate normal distribution whose mean and standard deviation are calculated from the best points found so far. Numerical experiments show the efficiency of this method compared to other global optimization heuristics. Moreover, applied to complex engineering test problems, this method substantially improves the ability to reach the feasible region and to escape local minima.
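
The sample-and-refit mechanism described above can be sketched as a generic Cross-Entropy loop. This is an illustrative stand-alone version under simplifying assumptions (per-coordinate normal sampling, unconstrained objective, hypothetical function and parameter names), not the authors' MADS-integrated implementation.

```python
import math
import random

def cross_entropy_search(f, mean, std, n_samples=50, n_elite=10, iters=20, seed=0):
    # Generic CE loop: sample trial points from a normal distribution,
    # keep the best (elite) points, and refit the distribution's mean
    # and standard deviation to those elites.
    rng = random.Random(seed)
    dim = len(mean)
    mean, std = list(mean), list(std)
    best_x, best_f = None, math.inf
    for _ in range(iters):
        samples = [[rng.gauss(mean[j], std[j]) for j in range(dim)]
                   for _ in range(n_samples)]
        samples.sort(key=f)                 # best objective value first
        elites = samples[:n_elite]
        if f(samples[0]) < best_f:
            best_x, best_f = samples[0], f(samples[0])
        for j in range(dim):                # refit per-coordinate normal
            col = [e[j] for e in elites]
            mean[j] = sum(col) / n_elite
            var = sum((v - mean[j]) ** 2 for v in col) / n_elite
            std[j] = math.sqrt(var) + 1e-12  # small floor avoids collapse
    return best_x, best_f
```

In the combination described in the talk, such a sampling step would serve as the (optional) exploration step of MADS, with the poll step preserving the convergence guarantees.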

  • 11:20 – 11:45

    Survey Descent: A Multipoint Generalization of Gradient Descent for Nonsmooth Optimization

    • X.Y. Han, presenter, Cornell University
    • Adrian Lewis, Cornell University

    For strongly convex objectives that are smooth, the classical theory of gradient descent ensures linear convergence relative to the number of gradient evaluations. An analogous nonsmooth theory is challenging: even when the objective is smooth at every iterate, the corresponding local models are unstable, and traditional remedies need unpredictably many cutting planes. We instead propose a multipoint generalization of the gradient descent iteration for local optimization. While designed with general objectives in mind, we are motivated by a "max-of-smooth" model that captures the subdifferential dimension at optimality. We prove linear convergence when the objective is itself max-of-smooth, and experiments suggest a more general phenomenon.
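
To make the "max-of-smooth" structure concrete, a minimal toy example (illustrative only, not from the talk): the pointwise maximum of two smooth quadratics is nonsmooth exactly where the pieces tie, and counting the active pieces at a point indicates the subdifferential dimension there (k active smooth pieces give a subdifferential of dimension up to k − 1).

```python
def pieces(x, y):
    # Two smooth quadratic pieces; their pointwise max is nonsmooth.
    return [(x - 1.0) ** 2 + y ** 2, (x + 1.0) ** 2 + y ** 2]

def max_of_smooth(x, y):
    # Max-of-smooth objective: smooth wherever one piece strictly wins.
    return max(pieces(x, y))

def n_active(x, y, tol=1e-9):
    # Number of pieces attaining the max at (x, y).
    vals = pieces(x, y)
    m = max(vals)
    return sum(1 for v in vals if m - v <= tol)
```

Here the minimizer (0, 0) lies on the tie set x = 0, where both pieces are active, so the objective is nonsmooth precisely at optimality.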

  • 11:45 – 12:10

    A study of quadratic search step formulations for multiobjective derivative-free optimization

    • Ludovic Salomon, presenter, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Jean Bigeon, Univ. Grenoble Alpes, CNRS, Grenoble INP, G-SCOP, F-38000 Grenoble, France

    Many engineering applications involve the optimization of several contradictory criteria and do not possess an explicit algebraic structure that could be exploited. DMulti-MADS is a multiobjective derivative-free optimization algorithm which targets these types of problems. It is based on the Mesh Adaptive Direct Search (MADS) algorithm for single-objective optimization. As a direct search method, each of its iterations is divided into two steps: a poll and a search. The poll consists of a local exploration around an incumbent solution and is the step on which the convergence analysis relies. The search is an optional, more flexible step that enhances the practical performance of the procedure. This talk proposes the integration of quadratic models into DMulti-MADS to improve its performance. New formulations are explored and compared with existing ones. Preliminary numerical results show that these approaches improve the performance of the method relative to other state-of-the-art algorithms.
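
The idea of a quadratic model search can be illustrated in the simplest possible setting: a one-dimensional, single-objective toy (not the multiobjective formulations studied in the talk, and all names are hypothetical). A quadratic is interpolated through three already-evaluated points and its minimizer is proposed as a search candidate.

```python
def quadratic_search_step(points):
    # points: three evaluated pairs (x, f(x)) with distinct x values.
    # Fit the exact interpolating quadratic via Newton divided
    # differences and return its minimizer as a search candidate, or
    # None when the model is concave or linear (no minimizer to propose).
    (x0, f0), (x1, f1), (x2, f2) = points
    d1 = (f1 - f0) / (x1 - x0)
    d2 = ((f2 - f1) / (x2 - x1) - d1) / (x2 - x0)
    if d2 <= 0:
        return None
    # Vertex of q(x) = f0 + d1*(x - x0) + d2*(x - x0)*(x - x1).
    return 0.5 * (x0 + x1 - d1 / d2)
```

In a direct search framework, such a candidate is simply evaluated by the true objective: if it improves on the incumbent the search succeeds, and otherwise the method falls back on the poll step, so the convergence analysis is unaffected.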