HEC Montréal, Canada, 6 - 8 mai 2013

Journées de l'optimisation 2013

MB5 Derivative-Free Optimization II

6 May 2013, 15h30 – 17h10

Room: CPA du Québec

Chaired by Charles Audet

4 presentations

  • 15h30 - 15h55

    Approximating Normal Cones for Constrained Optimization

    • Warren Hare, presenter, UBC

    Normal cones provide powerful information about projections, tangent directions, and stopping conditions in constrained optimization. When the constraint set is defined through a collection of (well-behaved) analytic functions, normal cones are easily computed. In this talk we consider the situation where the constraint set is provided through an oracle function or collection of oracle functions. Methods for approximating normal cones under these conditions are provided and compared.
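    For reference, a standard definition of the object discussed in this abstract (in the convex case; the nonconvex definitions used in variational analysis generalize this) is:

    ```latex
    % Normal cone of a convex set C \subseteq \mathbb{R}^n at a point x \in C:
    N_C(x) = \{\, v \in \mathbb{R}^n \;:\; \langle v,\, y - x \rangle \le 0 \ \ \forall\, y \in C \,\}
    % Its role in stopping conditions: first-order stationarity of x^* for
    % \min_{x \in C} f(x) can be written as  -\nabla f(x^*) \in N_C(x^*).
    ```

    When $C$ is known only through an oracle (a routine that answers feasibility queries rather than supplying defining functions), $N_C(x)$ is not available in closed form and must be approximated from sampled points, which is the setting the talk addresses.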

  • 15h55 - 16h20

    Directional Direct-Search Optimization with Polling Directions Based on Equal Angle Distributions

    • Thomas Asaki, presenter, Washington State University
    • Benjamin Van Dyke, Washington State University

    We consider new instances of MADS/GSS algorithms which emphasize uniform distributions of search directions. We utilize minimal or maximal positive bases having equal, or nearly equal, angle distributions. The goal is performance enhancement for high-dimensional constrained problems. Results and comparisons are presented for a variety of test problems.
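    As a toy illustration of the equal-angle idea (a sketch for intuition only, not the authors' construction): in R², the three unit vectors spaced 120° apart form a minimal positive basis with exactly equal pairwise angles, meaning every direction in the plane makes an angle of at most 60° with one of them.

    ```python
    import math
    import random

    def equal_angle_basis_2d(m=3):
        """Unit vectors in R^2 spaced 2*pi/m apart.
        For m = 3 this is a minimal positive basis with equal pairwise angles."""
        return [(math.cos(2 * math.pi * k / m), math.sin(2 * math.pi * k / m))
                for k in range(m)]

    def positively_spans(dirs, samples=1000):
        """Heuristic check that dirs positively span R^2: every sampled unit
        vector has a strictly positive inner product with at least one of them."""
        random.seed(0)
        for _ in range(samples):
            theta = random.uniform(0, 2 * math.pi)
            v = (math.cos(theta), math.sin(theta))
            if max(d[0] * v[0] + d[1] * v[1] for d in dirs) <= 1e-12:
                return False
        return True

    basis = equal_angle_basis_2d()
    print(positively_spans(basis))  # True
    ```

    The equal-angle property guarantees a uniform worst-case angle between an arbitrary descent direction and the nearest poll direction, which is the motivation the abstract gives for uniform direction distributions.
    
    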

  • 16h20 - 16h45

    Pairing Derivative-Free Optimization and Sensitivity Analysis Using a Hybrid Framework

    • Genetha Gray, presenter, Sandia National Labs
    • John Siirola, Sandia National Labs

    Because each optimization method has inherent strengths and weaknesses, picking a suitable algorithm is challenging and has been the subject of many studies and much debate. To exploit the benefits of more than one approach and to compensate for their individual shortcomings, two or more methods may be combined into a hybrid. In this talk, we will discuss a hybrid software framework and give some examples of its use. We will also explain how it can be used to incorporate sensitivity studies into a derivative-free optimization process in order to describe some of the uncertainties associated with the suggested solutions.

  • 16h45 - 17h10

    Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

    • Charles Audet, presenter, GERAD - Polytechnique Montréal
    • Andrea Ianni, Università di Roma La Sapienza
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Christophe Tribes, Polytechnique Montréal

    We propose strategies to improve the efficiency of MADS algorithms by reducing the maximal number of trial points at each iteration without impacting the quality of the solution. We devise various strategies, embedded in a generic algorithmic framework, that order the trial points in such a way that the promising points are evaluated first, and the unpromising points are discarded and replaced by a single point. A crucial element is that the proposed methods retain the hierarchical nonsmooth convergence analysis.
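    One simple ordering heuristic in this spirit (illustrative only; the strategies in the talk are more general) evaluates first the poll points whose direction from the incumbent is most aligned with the last successful direction:

    ```python
    def order_poll_points(incumbent, candidates, last_success_dir):
        """Sort trial points so those most aligned with the last successful
        direction are evaluated first (opportunistic ordering heuristic)."""
        def alignment_key(p):
            d = [pi - xi for pi, xi in zip(p, incumbent)]
            norm = sum(di * di for di in d) ** 0.5 or 1.0
            dot = sum(di * si for di, si in zip(d, last_success_dir))
            return -dot / norm  # more aligned -> smaller key -> evaluated earlier

        return sorted(candidates, key=alignment_key)

    x = [0.0, 0.0]
    pts = [[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]]
    print(order_poll_points(x, pts, [1.0, 0.0]))
    # [[1.0, 0.0], [0.0, 1.0], [0.0, -1.0], [-1.0, 0.0]]
    ```

    Under an opportunistic polling rule, the iteration stops at the first trial point that improves the incumbent, so a good ordering directly reduces the number of function evaluations, which is the efficiency gain the abstract targets.
    
    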