Ghost penalties are an innovative tool, recently introduced by members of this research project, for analyzing the convergence properties of optimization methods. While they share several characteristics with the Lyapunov functions classically used to analyze optimization algorithms, they have distinctive features that make them extremely flexible and suitable for domains where more traditional methods have not yielded significant results. For example, ghost penalties made it possible to give the first convergence and complexity results for diminishing-stepsize methods in nonconvex optimization. Building on the expertise accumulated in the past few years, this project aims to uncover new applications of the ghost-penalty technique. In particular, we plan to investigate the following topics:
1) Development of the first provably convergent algorithm for nonconvex stochastic optimization problems with stochastic constraints.
2) Development of the first provably convergent distributed algorithm for nonconvex problems with nonconvex constraints.
3) A complexity analysis of sequential quadratic programming methods under realistic assumptions.
4) Applications to games and bilevel optimization.
5) Applications in science and engineering.
6) Development and release of a computer code.
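As background for the diminishing-stepsize methods mentioned above, the following is a minimal sketch (not the ghost-penalty method itself, whose details are outside this summary) of gradient descent on a nonconvex function with stepsizes alpha_k = alpha_0/(k+1), which satisfy the classical conditions sum alpha_k = infinity and sum alpha_k^2 < infinity; the function, starting point, and stepsize constant are illustrative choices.

```python
import numpy as np

def diminishing_step_gd(grad, x0, alpha0=0.01, iters=200):
    """Gradient descent with diminishing stepsizes alpha_k = alpha0/(k+1).

    Returns the final iterate and the smallest gradient norm seen, the
    quantity that convergence/complexity results for such schemes
    typically bound.
    """
    x = np.asarray(x0, dtype=float)
    best_gnorm = np.inf
    for k in range(iters):
        g = grad(x)
        best_gnorm = min(best_gnorm, np.linalg.norm(g))
        x = x - (alpha0 / (k + 1)) * g  # diminishing stepsize
    return x, best_gnorm

# Illustrative nonconvex example: f(x) = x^4 - 3x^2 + x
f_grad = lambda x: 4 * x**3 - 6 * x + 1
x_final, gnorm = diminishing_step_gd(f_grad, np.array([2.0]))
```

In nonconvex problems one cannot certify a global minimum; analyses of this kind instead bound how fast the best gradient norm encountered decreases with the iteration count.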