15h30 - 15h55
Large-scale taxi routing in Julia
We study the taxi-routing problem and solve large-scale instances with real demand data using Julia. We combine scalable mixed-integer formulations, randomized heuristics, and decision-making under uncertainty to optimize the routing decisions of Manhattan's yellow cabs. We explore the trade-offs necessary to extend traditional OR techniques to large instances with real demand data. All optimization formulations, data wrangling, algorithms, and visualizations are coded in Julia.
15h55 - 16h20
Mixed-integer convex optimization
We present a new framework for mixed-integer convex optimization based on polyhedral outer approximation in an extended set of variables. The generation of extended formulations is automated using the modeling framework of disciplined convex programming. This new approach has resulted in the solution of open instances from the MINLPLIB2 benchmark library. The solver, Pajarito, is implemented in Julia and was recently released as open-source software.
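The core idea behind polyhedral outer approximation can be sketched numerically. The following Python snippet (illustrative only; Pajarito itself is written in Julia and works with full conic/convex structure) checks that gradient cuts at trial points are valid linear underestimators of a convex function, so the polyhedron they define always contains the original feasible set. The example function `f(x) = x^2 - 4` and the trial points are assumptions chosen for illustration.

```python
# Sketch of polyhedral outer approximation for a convex constraint
# f(x) <= 0, here f(x) = x^2 - 4 (feasible set is the interval [-2, 2]).
# A gradient ("first-order") cut at a trial point x_k is
#   g(x) = f(x_k) + f'(x_k) * (x - x_k),
# and convexity guarantees g(x) <= f(x) everywhere, so the cuts never
# exclude a feasible point.

def f(x):
    return x**2 - 4.0

def grad_f(x):
    return 2.0 * x

def make_cut(xk):
    """Return the linear cut g(x) = f(xk) + f'(xk) * (x - xk)."""
    fk, gk = f(xk), grad_f(xk)
    return lambda x: fk + gk * (x - xk)

# Cuts generated at a few (arbitrary, illustrative) trial points.
cuts = [make_cut(xk) for xk in (-3.0, 1.0, 2.5)]

samples = [i / 10.0 for i in range(-50, 51)]

# Each cut underestimates f everywhere (validity of the cuts)...
for x in samples:
    for cut in cuts:
        assert cut(x) <= f(x) + 1e-9

# ...so every feasible point satisfies all cuts: the polyhedron
# {x : all cuts <= 0} is an outer approximation of {x : f(x) <= 0}.
feasible = [x for x in samples if f(x) <= 0]
assert all(all(cut(x) <= 1e-9 for cut in cuts) for x in feasible)
print("outer approximation verified on samples")
```

A mixed-integer convex solver alternates between solving such polyhedral (MILP) relaxations and adding new cuts at the relaxation's solution until the gap closes.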
16h20 - 16h45
Optimal survey design: Nonconvex MINLP in Couenne vs. MIQCP reformulation in CPLEX
We propose and solve a novel optimal survey design model that integrates learning and optimization by implicitly modeling the 'exploration vs. exploitation' trade-off in this multistage adaptive setting. The natural formulation possesses nonconvex constraints with binary-continuous bilinear terms. Compounding the challenge of these nonconvex constraints, many of the bilinear terms are unbounded. This MINLP formulation can be given directly to Couenne, a global solver. However, by using McCormick linearizations on the bilinear terms, and clever techniques (which require a feasible solution) for bounding the unbounded variables, an MIQCP formulation can be solved with CPLEX. Alternatively, rather than searching for bounds and writing McCormick constraints, we could model the bilinear terms with CPLEX's 'indicator constraints', and tell CPLEX to use 'local implied bound cuts', a relatively new feature. We implement these three formulations in JuMP and compare performance.
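The McCormick linearization mentioned above replaces a bilinear term w = x·y with four linear inequalities that are valid whenever x and y have finite bounds, which is exactly why bounding the unbounded variables is a prerequisite. The Python check below (illustrative only; the talk's formulations are written in JuMP, and the bounds here are made-up numbers) verifies the four envelope inequalities on a sample grid.

```python
# Numeric check of the McCormick envelope for a bilinear term w = x*y
# with bounds x in [xL, xU], y in [yL, yU]. The four inequalities come
# from expanding the nonnegative/nonpositive products
#   (x - xL)(y - yL) >= 0,  (x - xU)(y - yU) >= 0,
#   (x - xU)(y - yL) <= 0,  (x - xL)(y - yU) <= 0.
# Every point with w = x*y inside the box satisfies them, so replacing
# the bilinear equation by these constraints yields a valid relaxation.

def mccormick_valid(x, y, xL, xU, yL, yU, tol=1e-9):
    w = x * y  # exact bilinear value; the relaxation lets w deviate
    under1 = xL * y + x * yL - xL * yL <= w + tol
    under2 = xU * y + x * yU - xU * yU <= w + tol
    over1 = w <= xU * y + x * yL - xU * yL + tol
    over2 = w <= xL * y + x * yU - xL * yU + tol
    return under1 and under2 and over1 and over2

# Illustrative bounds (not from the talk).
xL, xU, yL, yU = -1.0, 2.0, 0.0, 3.0
xs = [a / 4.0 for a in range(-4, 9)]   # x samples in [-1, 2]
ys = [b / 4.0 for b in range(0, 13)]   # y samples in [0, 3]
assert all(mccormick_valid(x, y, xL, xU, yL, yU) for x in xs for y in ys)
print("McCormick envelope holds at all sampled points")
```

Note that the envelope is only as tight as the bounds: this is why the abstract emphasizes techniques for bounding the unbounded bilinear terms before the MIQCP reformulation can be handed to CPLEX.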