08h45 - 09h10
Solving a feasibility problem in the nonsmooth case
In this talk, we present an algorithm for solving a feasibility problem. More precisely, we study a variational inclusion whose set-valued part is a nonempty closed convex cone in a Banach space. To this end, we use convex processes and focus on the nonsmooth case, namely, when the single-valued function is not necessarily differentiable but admits a first-order divided difference. A comparison with existing results will be made.
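To fix ideas, the problem class described above can be sketched as follows; the notation is assumed for illustration and is not taken from the abstract:

```latex
% Hypothetical formulation, for illustration only:
% find $x \in X$ (a Banach space) such that
\[
  0 \in f(x) + C,
\]
% where $f \colon X \to Y$ is single-valued, not necessarily
% differentiable, but admits a first-order divided difference
% $[u, v; f]$, i.e. a bounded linear operator satisfying
\[
  [u, v; f](u - v) = f(u) - f(v),
\]
% and $C \subseteq Y$ is a nonempty closed convex cone.
```

Divided differences play the role of the derivative in Newton-type schemes when the function is merely continuous.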
09h10 - 09h35
Some properties of nonconvex sets
This work studies some properties of the weak subdifferential and of augmented normal cones. The weak subdifferential is based on the notion of supporting conic surfaces and plays an important role in nonconvex optimization. We establish a relation between the weak subdifferential of the indicator function of a nonconvex set and the augmented normal cone, and we use this relation to investigate properties of nonconvex sets.
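For context, a standard definition of the weak subdifferential (in the style of Azimov and Gasimov) reads as follows; the notation is assumed for illustration:

```latex
% A pair $(x^*, c) \in X^* \times \mathbb{R}_+$ is a weak subgradient
% of $f$ at $\bar{x}$ if
\[
  f(x) - f(\bar{x}) \ge \langle x^*, x - \bar{x} \rangle - c \,\|x - \bar{x}\|
  \quad \text{for all } x \in X.
\]
% Applied to the indicator function $\delta_S$ of a set $S$ at
% $\bar{x} \in S$, both indicator values vanish on $S$, leaving
\[
  \langle x^*, x - \bar{x} \rangle \le c \,\|x - \bar{x}\|
  \quad \text{for all } x \in S,
\]
% a condition of exactly the type used to define augmented normal
% cones to (possibly nonconvex) sets.
```

The extra term $-c\|x - \bar{x}\|$ is what allows supporting conic surfaces, rather than hyperplanes, to touch nonconvex sets.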
09h35 - 10h00
Ghost penalties and vanishing stepsizes in nonconvex constrained optimization
We consider Diminishing Stepsize Methods (DSMs) for the solution of constrained, nonconvex optimization problems. While DSMs were introduced in the context of unconstrained, convex problems, they turn out to be much harder to analyze in constrained, nonconvex settings, and, to date, only partial results are available. This paper aims at filling this gap, thus completing the analysis of DSMs. Specifically, we devise a method that, along with the diminishing step-size procedure, systematically generates directions that are solutions of suitable convex approximations of the original problem. Our approach, which allows for a “virtual” use, as a mere theoretical tool, of an exact penalty function, shows convergence to generalized stationary points of the original problem. Some global complexity results are also reported.
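The generic scheme described above can be sketched as follows; the notation is assumed for illustration and is not the authors' exact method:

```latex
% Diminishing step-size iteration (sketch):
\[
  x^{k+1} = x^k + \gamma^k d^k,
  \qquad
  \gamma^k > 0, \quad \gamma^k \to 0, \quad \sum_{k=0}^{\infty} \gamma^k = \infty,
\]
% where the direction $d^k$ solves a suitable convex approximation
% of the original nonconvex constrained problem around $x^k$.
```

The divergent-sum condition lets the iterates travel arbitrarily far despite vanishing steps, while $\gamma^k \to 0$ damps the error made by the convex approximations.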