15h30 - 17h00
Derivative-free optimization refers to the development, study, and application of algorithms for solving optimization problems without using (sub-)gradient evaluations. Applications are rich and include almost any problem in which the objective function is provided via a computer simulation. In this introductory lecture, we will examine three popular methods: Genetic Algorithms, Pattern Search Methods, and Simplex Gradient Descent. We will highlight the strengths and weaknesses of each and compare the three on a real-world application.
This tutorial assumes that the audience is comfortable with: Linear Algebra (matrix manipulation), Multivariate Calculus, and Numerical Analysis (notably, Taylor's Theorem).