10h30 - 10h55
Review of measures of the quality of approximated Pareto fronts in multiobjective optimization
In recent years, the development of new algorithms for multiobjective optimization has grown considerably. To address the need to assess the quality of approximated Pareto fronts, metrics used as performance indicators have been developed over the last two decades. They are notably used to compare multiobjective optimization algorithms. In this presentation, we propose a review of performance indicators. We first give a mathematical framework describing the indicators and classify them according to their properties. Some applications are presented, as well as new opportunities for the future.
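As an illustration of the kind of indicator the review covers (the example itself is not from the abstract), one widely used performance indicator is the hypervolume: the area (in 2D) of the objective space dominated by an approximated front, bounded by a reference point. A minimal sketch for a biobjective minimization problem, assuming a nondominated set and a reference point dominated by every front point:

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a 2D nondominated front (minimization)
    with respect to a reference point `ref` worse in both objectives."""
    hv = 0.0
    prev_f2 = ref[1]
    # Sweep in ascending f1; f2 then decreases along a nondominated front.
    for f1, f2 in sorted(front):
        hv += (ref[0] - f1) * (prev_f2 - f2)  # new horizontal slab
        prev_f2 = f2
    return hv

# Hypothetical front and reference point.
hv = hypervolume_2d([(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)], ref=(4.0, 4.0))
```

A larger hypervolume indicates a front that is closer to, and spreads more widely along, the true Pareto front.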
10h55 - 11h20
Dynamic and static models in derivative-free optimization
The Mesh Adaptive Direct Search algorithm (MADS) orders poll points using models. Two types of model can be used: a given static surrogate, or a dynamic model updated at each iteration. This work introduces a third type of model: a dynamic model that uses the static surrogate information. This new model aims to improve the efficiency of model ordering for the poll step when a static surrogate is given.
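To make the role of model ordering concrete, here is a minimal sketch (not the authors' implementation) of an opportunistic poll step: candidates are ranked by a cheap model's prediction, and the expensive true function is evaluated in that order, stopping at the first improvement. The particular functions below are hypothetical.

```python
def ordered_poll(f, model, incumbent, f_inc, poll_points):
    """Evaluate poll points in order of model prediction,
    stopping opportunistically at the first improving point."""
    # Rank candidates by the (cheap) model prediction.
    for x in sorted(poll_points, key=model):
        fx = f(x)              # expensive true evaluation
        if fx < f_inc:         # opportunistic success: accept and stop
            return x, fx
    return incumbent, f_inc    # poll failure: keep the incumbent

# Hypothetical example: a biased but rank-informative model.
f = lambda x: (x - 1.3) ** 2       # true objective
model = lambda x: (x - 1.0) ** 2   # surrogate used only for ordering
x, fx = ordered_poll(f, model, 0.0, f(0.0), [-1.0, 0.5, 1.0, 2.0])
```

A good ordering lets the poll succeed after few true evaluations, which is where a dynamic model informed by the static surrogate can pay off.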
11h20 - 11h45
Mesh adaptive direct search algorithms for multifidelity optimization
Multifidelity optimization occurs in engineering design when multiple engineering simulation codes are available at different levels of fidelity. An example of this is aerodynamic shape optimization of a wing, in which aerodynamic quantities, such as lift and drag, can be computed using a full Navier-Stokes solver, an Euler solver, or a linearized potential code. High-fidelity simulations are more accurate, but also more computationally expensive. The goal of this work is the design of algorithms that optimize with respect to the high-fidelity simulation, but exploit the lower fidelity codes as much as possible. Two new surrogate-based mesh adaptive direct search (MADS) algorithms will be presented, in which interpolating surrogates are constructed and updated from previously evaluated iterates of the algorithm to speed convergence. The first algorithm employs a recursive Search step that optimizes a surrogate function constructed from the next lower fidelity level simulation augmented with an interpolating surrogate that accounts for the difference between adjacent levels of fidelity. The second approach is an augmentation of the optimization problem, in which the fidelity level is incorporated as a variable in the problem, and a relaxable constraint is added to force the solution to be at the highest level of fidelity. Some preliminary numerical results are presented.
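The first algorithm's surrogate can be illustrated with an additive correction model: the low-fidelity code plus an interpolant of the high-minus-low discrepancy at previously evaluated points. The sketch below is a simplified 1D illustration under assumed functions, not the talk's actual construction.

```python
import numpy as np

def correction_surrogate(f_low, xs, f_high_vals):
    """Build s(x) = f_low(x) + delta(x), where delta interpolates the
    discrepancy f_high - f_low at the previously evaluated points xs."""
    xs = np.asarray(xs, dtype=float)
    deltas = np.asarray(f_high_vals, dtype=float) - f_low(xs)
    # Piecewise-linear interpolation of the discrepancy (1D illustration).
    return lambda x: f_low(np.asarray(x, dtype=float)) + np.interp(x, xs, deltas)

# Hypothetical fidelity levels: the cheap model misses a linear trend.
f_high = lambda x: np.sin(x) + 0.3 * x   # expensive "truth"
f_low = lambda x: np.sin(x)              # cheap model
xs = [0.0, 1.0, 2.0, 3.0]
s = correction_surrogate(f_low, xs, f_high(np.array(xs)))
# s reproduces f_high exactly at the sampled points xs.
```

Optimizing such a surrogate in the Search step lets most evaluations fall on the cheap code, with the correction keeping the surrogate consistent with the high-fidelity simulation where it has been sampled.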