15th EUROPT Workshop on Advances in Continuous Optimization

Montréal, Canada, July 12–14, 2017


Nonconvex and nonsmooth optimization

July 14, 2017, 08:45 – 10:00

Room: TD Assurance Meloche Monnex

Chaired by Lorenzo Lampariello

3 presentations

  • 08:45 – 09:10

    Solving a feasibility problem in the nonsmooth case

    • Célia Jean-Alexis, Université des Antilles et de la Guyane

    In this talk, we present an algorithm for solving a feasibility problem. More precisely, we study a variational inclusion in which the set-valued part is a nonempty closed convex cone in a Banach space. To this end, we use convex processes and focus on the nonsmooth case, namely when the single-valued function is not necessarily differentiable but admits a first-order divided difference. A comparison with existing results will be made.

  • 09:10 – 09:35

    Some properties for nonconvex sets

    • Samet Bila, presenter
    • Refail Kasimbeyli, Advisor

    This work studies some properties of the weak subdifferential and of augmented normal cones. The weak subdifferential is based on the notion of supporting conic surfaces and plays an important role in nonconvex optimization. In this work, we establish a relation between the weak subdifferential of the indicator function of a nonconvex set and the augmented normal cone, and we use this relation to investigate some properties of nonconvex sets.

  • 09:35 – 10:00

    Ghost penalties and vanishing stepsizes in nonconvex constrained optimization

    • Lorenzo Lampariello, presenter, Università degli Studi Roma Tre
    • Francisco Facchinei, Sapienza University of Rome
    • Vyacheslav Kungurtsev, Czech Technical University in Prague
    • Gesualdo Scutari, Purdue University

    We consider Diminishing Stepsize Methods (DSMs) for the solution of constrained, nonconvex optimization problems. While DSMs were introduced in the context of unconstrained, convex problems, they turn out to be much harder to analyze in constrained, nonconvex settings, and, to date, only partial results are available. This work aims to fill this gap, thereby completing the analysis of DSMs. Specifically, we devise a method that, along with the diminishing-stepsize procedure, systematically generates directions that are solutions of suitable convex approximations of the original problem. Our approach, which allows for a “virtual” use, as a mere theoretical tool, of an exact penalty function, shows convergence to generalized stationary points of the original problem. Some global complexity results are also reported.
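
For orientation, the setting described in the first abstract can be sketched as follows; the symbols $f$, $K$, and the divided-difference notation are assumptions of this sketch, not taken from the talk:

```latex
% Sketch of the feasibility problem as a variational inclusion
% (f, K, and the divided-difference notation are assumed here):
% find x in a Banach space X such that
\[
  0 \in f(x) + K,
\]
% where K \subseteq Y is a nonempty closed convex cone and
% f : X \to Y is not necessarily differentiable but admits a
% first-order divided difference [x, y; f], i.e. a bounded linear
% operator satisfying
\[
  [x, y; f](x - y) = f(x) - f(y).
\]
```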
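
The weak subdifferential in the second abstract has a standard description in terms of supporting conic surfaces (notation here is assumed, not taken from the talk): instead of a supporting hyperplane, a pair $(x^*, c)$ supports $f$ from below by a conic surface.

```latex
% Weak subgradient of f : X -> R at \bar{x} (notation assumed):
% a pair (x^*, c) \in X^* \times \mathbb{R}_+ such that
\[
  f(x) \ge f(\bar{x}) + \langle x^*, x - \bar{x} \rangle
          - c \, \| x - \bar{x} \| \quad \text{for all } x \in X.
\]
% The weak subdifferential \partial^w f(\bar{x}) is the set of all
% such pairs; applied to the indicator function of a set, these pairs
% are what the abstract relates to the augmented normal cone.
```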
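
The diminishing-stepsize idea in the last abstract can be illustrated with a minimal projected-gradient sketch. This is not the authors' method (which builds directions from convex approximations of the problem and uses an exact penalty function as a theoretical tool); it only shows the generic stepsize rule with alpha_k tending to 0 while the stepsizes sum to infinity, on an assumed toy problem.

```python
# Minimal diminishing-stepsize sketch (illustrative toy problem, not
# the method of the talk): minimize a nonconvex f over the box [-2, 2]
# by projected gradient steps with alpha_k = a / (k + 1).

def grad(x):
    # Gradient of the assumed objective f(x) = x**4 - 3*x**2 + x.
    return 4 * x**3 - 6 * x + 1

def project(x, lo=-2.0, hi=2.0):
    # Euclidean projection onto the feasible box [lo, hi].
    return max(lo, min(hi, x))

def dsm(x0, iters=2000, a=0.5):
    # Diminishing stepsizes: alpha_k -> 0 and sum_k alpha_k = infinity.
    x = project(x0)
    for k in range(iters):
        alpha = a / (k + 1)
        x = project(x - alpha * grad(x))
    return x

x_star = dsm(1.5)  # converges to a stationary point of f in [-2, 2]
```

Under these assumptions the iterates settle at a stationary point of the toy objective; the talk's contribution is precisely the convergence and complexity analysis of such schemes in the constrained, nonconvex setting.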