Optimization Days 2024
HEC Montréal, Québec, Canada, 6 — 8 May 2024

MB7 - Derivative-Free and Blackbox Optimization II
May 6, 2024 03:30 PM – 05:10 PM
Location: Quebecor (yellow)
Chaired by Youssef Diouane
4 Presentations
03:30 PM - 03:55 PM
Blackbox optimization for origami-inspired multistable structures
Multistable mechanical systems exhibit more than one stable configuration where the elastic energy is locally minimized. The art of origami has been proposed as a versatile platform to design deployable structures with both compact and functional stable states.
Conceptually, a multistable origami motif is constructed from two-dimensional surfaces connected by one-dimensional fold lines, leading to ideal stable configurations exhibiting zero-energy local minima. Physically, origami-inspired structures are three-dimensional, comprising facets and hinges fabricated in one particular stable state, which causes that state to dominate the others.
To improve mechanical performance, one can solve the constrained optimization problem of maximizing the amount of elastic energy required to switch between stable states.
The Mesh Adaptive Direct Search (MADS) algorithm is used to solve the optimization problem. Initially, the bistable waterbomb-base origami motif is selected as a case-study to present the methodology. The elastic energy of this origami pattern under deployment is calculated via Finite Element (FE) simulations which serve as the blackbox in the optimization loop.
To validate the results, optimized waterbomb-base geometries are built via Fused Filament Fabrication and tested experimentally on a Uniaxial Test Machine.
The methodology is then extended to more complex origami patterns, as well as functional origami-inspired structures.
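The poll-and-refine loop at the heart of MADS-style methods can be sketched in a few lines. The following is a simplified compass-style direct search, not the full MADS algorithm (no randomized polling directions, no mesh/frame distinction), and the toy quadratic is a stand-in for the FE elastic-energy blackbox:

```python
import itertools

def direct_search(blackbox, x0, delta=1.0, tol=1e-6, max_iter=200):
    """Simplified poll-based direct search in the spirit of MADS:
    poll along coordinate directions, refine the mesh on failure."""
    x, fx = list(x0), blackbox(x0)
    for _ in range(max_iter):
        if delta < tol:
            break
        improved = False
        # Poll the positive spanning set {+e_i, -e_i}.
        for i, s in itertools.product(range(len(x)), (1.0, -1.0)):
            y = list(x)
            y[i] += s * delta
            fy = blackbox(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break
        # Successful poll: coarsen the mesh; unsuccessful poll: refine it.
        delta = delta * 2.0 if improved else delta / 2.0
    return x, fx

# Toy quadratic standing in for the FE elastic-energy evaluation.
f = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
xbest, fbest = direct_search(f, [0.0, 0.0])
```

In the actual workflow, `blackbox` would wrap an FE simulation of the origami structure's elastic energy rather than an analytic function.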
03:55 PM - 04:20 PM
Hierarchically constrained multi-fidelity blackbox optimization
This work introduces a novel multi-fidelity blackbox optimization algorithm designed to alleviate the resource-intensive task of evaluating infeasible points. The algorithm is an intermediary component bridging a direct-search solver and a blackbox, reducing the computation time per evaluation while preserving the efficiency and convergence properties of the chosen solver. This is made possible by assessing feasibility through a broad range of fidelities, leveraging information from cost-effective evaluations before committing to a full computation. These feasibility estimations are generated through a hierarchical evaluation of constraints, tailored to the multi-fidelity nature of the blackbox problem and defined by a biadjacency matrix, for which we propose a construction. A series of computational tests using the NOMAD solver on the solar family of blackbox problems is conducted to validate the approach. The results show a significant improvement in solution quality when an initial feasible starting point is known in advance of the optimization process. When this condition is not met, the outcomes are contingent upon certain properties of the blackbox.
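The hierarchical constraint evaluation can be illustrated with a small sketch. The biadjacency-matrix construction proposed in the talk is not reproduced here; the matrix `B` below is a hand-picked illustration in which row k lists the constraints checked at fidelity level k, cheapest first, so an infeasible point is rejected before any expensive evaluation runs:

```python
def hierarchical_feasibility(x, constraints, B):
    """Evaluate constraints fidelity level by fidelity level.
    B is a biadjacency matrix: B[k][j] = 1 if constraint j is checked
    at fidelity level k. Stop at the first level where any checked
    constraint is violated, saving the more expensive evaluations."""
    for k, row in enumerate(B):               # cheapest fidelity level first
        for j, active in enumerate(row):
            if active and constraints[j](x) > 0.0:   # c_j(x) <= 0 is feasible
                return False, k               # infeasible; stopped at level k
    return True, len(B) - 1

# Illustrative constraints: a cheap bound check, then a costlier one.
cons = [lambda x: abs(x[0]) - 2.0,            # cheap: |x_0| <= 2
        lambda x: x[0] ** 2 + x[1] ** 2 - 1.0]  # costlier: inside the unit disk
B = [[1, 0],   # level 0 checks constraint 0 only
     [0, 1]]   # level 1 checks constraint 1
```

For example, `hierarchical_feasibility([3.0, 0.0], cons, B)` rejects the point at level 0 without ever evaluating the second constraint.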
04:20 PM - 04:45 PM
An optimized tabu search algorithm
The effectiveness and efficiency of a metaheuristic algorithm depend heavily on its hyperparameters, particularly when addressing large-scale and real-world problems. Identifying optimal values for these hyperparameters typically cannot be achieved through trial and error alone; instead, efficient methods for hyperparameter tuning are essential to achieve the best outcomes. In this study, we demonstrate the efficacy of utilizing blackbox optimization to effectively select parameters for the tabu search algorithm. Our experiments leverage the Mesh Adaptive Direct Search (MADS) algorithm, and we will present the findings from these experiments.
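As a sketch of the tuning setup, the toy example below pairs a minimal tabu search (single-bit-flip moves with an aspiration criterion) with a naive outer loop that scores a few candidate tabu tenures. In the study, MADS plays the role of the outer optimizer over a richer parameter space, so the objective and tenure grid here are illustrative only:

```python
import random

def tabu_search(f, n, tenure, iters=200, seed=0):
    """Minimal tabu search over binary vectors using single-bit-flip moves."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    best, fbest = x[:], f(x)
    tabu = {}                          # bit index -> last iteration it is tabu
    for t in range(iters):
        cand = None
        for i in range(n):
            y = x[:]
            y[i] ^= 1
            fy = f(y)
            # Non-tabu moves are allowed; tabu moves only if they beat
            # the best value found so far (aspiration criterion).
            if tabu.get(i, -1) < t or fy < fbest:
                if cand is None or fy < cand[1]:
                    cand = (i, fy, y)
        if cand is None:
            continue
        i, fy, x = cand
        tabu[i] = t + tenure           # forbid flipping bit i for `tenure` steps
        if fy < fbest:
            best, fbest = x[:], fy
    return best, fbest

# Toy objective: Hamming distance to a hidden target bit-string.
target = [1, 0, 1, 1, 0, 0, 1, 0]
f = lambda x: sum(a != b for a, b in zip(x, target))

# Naive outer tuning loop over the tabu tenure (MADS plays this role in the study).
scores = {tenure: tabu_search(f, len(target), tenure)[1] for tenure in (1, 3, 5)}
best_tenure = min(scores, key=scores.get)
```

The outer loop treats "run tabu search with these hyperparameters, report the best objective value" as a blackbox, which is exactly the structure that makes direct-search tuners such as MADS applicable.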
04:45 PM - 05:10 PM
A direct-search method for decentralized derivative-free optimization
Derivative-free algorithms are particularly useful for tackling optimization problems in which the function to optimize arises from complex, expensive procedures that prevent the application of classical, derivative-based algorithms. In data-driven systems, this can manifest as computations performed over data stored on different computer nodes, which significantly complicates the optimization process.
In this talk, we will present a direct-search framework for minimizing a sum of functions that are dispatched across a network of communicating agents. The proposed method combines the use of positive spanning sets, a standard feature of direct-search techniques, with stepsize sequences suited to decentralized optimization. We establish global convergence of the proposed algorithm and provide a numerical comparison illustrating the benefit of our approach over strategies that explicitly attempt to approximate derivatives.
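A single poll step of such a framework might look like the sketch below, where each agent owns one term f_i of the objective and only scalar function values are aggregated. This is an illustrative simplification (a fixed coordinate positive spanning set, with centralized summation standing in for network communication), not the authors' algorithm:

```python
def decentralized_poll(local_fs, x, step):
    """One poll step of a direct search on f(x) = sum_i f_i(x), where each
    term f_i is owned by a different agent. Agents evaluate their own term
    at each trial point; only scalar values are aggregated by summation."""
    n = len(x)
    # Positive spanning set D = {+e_1, ..., +e_n, -e_1, ..., -e_n}.
    D = [[1.0 if j == i else 0.0 for j in range(n)] for i in range(n)]
    D += [[-d for d in row] for row in D]
    f = lambda y: sum(fi(y) for fi in local_fs)   # stands in for network aggregation
    fx = f(x)
    for d in D:
        y = [xi + step * di for xi, di in zip(x, d)]
        fy = f(y)
        if fy < fx:
            return y, fy, True    # successful poll: move to the better point
    return x, fx, False           # unsuccessful poll: caller refines the step

# Two "agents", each holding one term of the objective.
f1 = lambda v: (v[0] - 1.0) ** 2
f2 = lambda v: (v[1] + 1.0) ** 2

x, step = [0.0, 0.0], 1.0
for _ in range(60):
    x, fx, success = decentralized_poll([f1, f2], x, step)
    if not success:
        step /= 2.0               # shrink the stepsize after a failed poll
```

The stepsize update after unsuccessful polls is where the decentralized stepsize sequences mentioned in the abstract would enter; here a simple halving rule is used for illustration.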