Name and qualification of the project proposer: 
sb_p_2709234
Year: 
2021
Abstract: 

Many problems arising in engineering, management, and machine learning give rise to specific optimization problems, and solving them requires algorithms tailored to the peculiarities of the problem at hand. Two main classes of algorithms are considered in this project:
- Derivative-free methods
- Methods for large-scale optimization

"Derivative-free methods"
These methods do not require any analytical expression of the objective and constraint functions of the problem. They are the methods of choice for simulation-based optimization problems, i.e. complex problems characterized by the fact that only a "black box" (a simulation tool or an approximation technique) is available to compute the objective and constraint function values. In recent years, many derivative-free methods have been proposed for the solution of simulation-based problems that present specific difficulties or peculiarities. Among such features, one can mention the fact that the values of the objective and constraint functions may be computed with different precision (fidelity) or may be affected by random noise (following an unknown probability distribution); furthermore, some (or all) of the variables may be restricted to take integer values.
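
As a purely illustrative aside, the following minimal Python sketch shows the black-box setting described above: a simple coordinate-search scheme that uses only function values, with a quadratic test function standing in for a generic simulation oracle. The function names, tolerances and the test problem are illustrative assumptions and do not correspond to the methods studied in the project.

import numpy as np

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    # Minimal derivative-free coordinate search: probe +/- step along each
    # coordinate direction and halve the step when no improvement is found.
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:                 # accept the first improving point
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5                     # no improvement: refine the sampling step
            if step < tol:
                break
    return x, fx

if __name__ == "__main__":
    # placeholder "black box": a smooth quadratic standing in for a simulator
    black_box = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
    print(coordinate_search(black_box, x0=[0.0, 0.0]))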

"Methods for large-scale optimization"
This class of methods is devoted to the solution of large-scale problems, i.e. problems with an (extremely) high number of variables and/or constraint functions. Recent developments in data mining and machine learning require the ability to deal with large-scale problems in which the particular structure of the objective function makes them very difficult to minimize. The proposed activity will study truncated Newton-type methods and how they can be adapted to take these computational difficulties into account.
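
For background only, the sketch below outlines a truncated Newton (Newton-CG) iteration of the kind referred to above: the inner conjugate-gradient loop is stopped early and uses only Hessian-vector products, which is what makes the approach viable at large scale. The Rosenbrock test problem, the tolerances and the simple Armijo backtracking are illustrative assumptions, not the adaptations that the project will develop.

import numpy as np

def truncated_newton(f, grad, hessp, x0, tol=1e-6, max_outer=200, cg_tol=0.5):
    # Schematic truncated Newton method: an inner CG loop approximately solves
    # H d = -g using only Hessian-vector products, then a backtracking (Armijo)
    # line search is applied along the resulting direction d.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.zeros_like(x)
        r = -g.copy()                       # residual of H d + g = 0 at d = 0
        p = r.copy()
        rs = r @ r
        for _ in range(2 * x.size):
            Hp = hessp(x, p)
            curv = p @ Hp
            if curv <= 1e-12:               # nonpositive curvature: stop the inner loop
                if not d.any():
                    d = -g                  # fall back to the steepest-descent direction
                break
            alpha = rs / curv
            d += alpha * p
            r -= alpha * Hp
            rs_new = r @ r
            if np.sqrt(rs_new) <= cg_tol * np.linalg.norm(g):   # truncation test
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                        # backtrack until sufficient decrease holds
        x = x + t * d
    return x

if __name__ == "__main__":
    # illustrative test: 2-D Rosenbrock with analytic gradient and Hessian-vector product
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                               200 * (x[1] - x[0] ** 2)])
    def hessp(x, v):
        h11 = 2 - 400 * (x[1] - x[0] ** 2) + 800 * x[0] ** 2
        return np.array([h11 * v[0] - 400 * x[0] * v[1],
                         -400 * x[0] * v[0] + 200 * v[1]])
    print(truncated_newton(f, grad, hessp, x0=np.array([-1.2, 1.0])))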

ERC: 
PE1_19
PE1_20
PE6_7
Research group members: 
sb_cp_is_3489790
sb_cp_is_3489792
sb_cp_is_3491668
sb_cp_is_3461978
sb_cp_is_3528573
sb_cp_is_3457543
sb_cp_is_3494836
sb_cp_es_463922
sb_cp_es_463921
sb_cp_es_463923
Innovativeness: 

The research group carries out extensive research on nonlinear optimization methods and has consolidated experience in using these methods to tackle difficult new real-world problems. These characteristics are well reflected in the proposed project, where recent instances of real-world problems call for further research on specific new methodologies.
Derivative-free methods
The study and definition of methods for black-box optimization problems is one of the main research topics of the group. Furthermore, thanks to its consolidated collaborations, the group has extensive experience in applying optimization algorithms to the optimal design of electrical motors and electromagnetic apparatus, optimal ship design problems, and health care service management. These methodological and applied research activities provide a sound basis for the following objectives:

- the definition of new algorithms for multifidelity optimization problems. By extending the results proposed in [1] or [3], it should be possible to define algorithms with stronger theoretical properties and better computational performance than the few algorithms proposed in the literature for this framework. The idea is that the proposed algorithms should efficiently select the fidelity used to compute the objective/constraint functions, starting from the lowest fidelity and moving to higher fidelities as they converge towards the minimum points (the fidelity-switching idea is illustrated by a sketch after this list). Such new algorithms could be used to efficiently solve the multifidelity optimization problems arising in optimal ship design, which requires high-fidelity computational tools, especially for innovative configurations and extreme/off-design conditions.

- the definition of algorithms for mixed-integer nonsmooth optimization problems. The frameworks proposed in [2], [3] and [5] could be the starting points to define efficient and globally convergent algorithms for this difficult class of problems. Few attempts to define such algorithms have been proposed in the literature, and none of them is able to deal efficiently with both the fact that the objective and constraint functions are nonsmooth with respect to the continuous variables and the fact that the number of discrete variables is not small.

- the definition of new algorithms for stochastic optimization problems. The linesearch and trust-region approaches proposed in [3] and [4] can be adapted to force the convergence of the procedure to a stationary point even when the evaluations of the objective/constraint functions are affected by random noise.
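
Purely to illustrate the fidelity-switching idea mentioned in the first objective above, the sketch below drives a coordinate search with the cheapest oracle and moves to the higher-fidelity one only when the sampling step (used here as a rough stationarity measure) becomes small. The two oracles, their costs, the switching threshold and the search scheme are all illustrative assumptions and are not the algorithms to be developed in the project.

import numpy as np

def multifidelity_search(oracles, costs, x0, step=1.0, tol=1e-6, budget=2000.0):
    # Fidelity-switching loop: start on the cheapest oracle and raise the
    # fidelity level as the sampling step shrinks, i.e. near stationarity.
    x = np.asarray(x0, dtype=float)
    level, spent = 0, 0.0
    fx = oracles[level](x); spent += costs[level]
    while step > tol and spent < budget:
        improved = False
        for i in range(x.size):
            for sign in (1.0, -1.0):
                trial = x.copy(); trial[i] += sign * step
                ft = oracles[level](trial); spent += costs[level]
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            if level + 1 < len(oracles) and step < 1e-2:
                level += 1                          # near stationarity: switch to higher fidelity
                fx = oracles[level](x); spent += costs[level]
            else:
                step *= 0.5                         # otherwise refine the sampling step
    return x, fx, level

if __name__ == "__main__":
    high = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2      # "expensive" model
    low = lambda x: high(x) + 0.05 * np.sin(5.0 * x[0])         # cheap, slightly biased surrogate
    print(multifidelity_search([low, high], costs=[1.0, 20.0], x0=[3.0, 3.0]))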

Methods for large-scale optimization
In this field, the research group can take advantage of the research activity carried out in previous years on large-scale unconstrained optimization methods and on training algorithms for machine learning systems. The aims of the proposed activity are:
- the definition of new algorithms for large-scale unconstrained optimization problems. The idea is to exploit and adapt the techniques proposed in [6] and [7] in order to tackle the difficult class of problems where some or all components of the gradient of the objective function are very small over large regions, and where the full gradient is not computationally available.
- the use of the new large-scale unconstrained optimization algorithms for training the complex machine learning systems required by recent data mining problems (a toy illustration based on an off-the-shelf truncated Newton solver is sketched below).
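
As a toy illustration of the last point only, the snippet below trains an L2-regularized logistic regression model with the off-the-shelf truncated Newton (Newton-CG) solver available in SciPy, supplying the loss, the gradient and a Hessian-vector product explicitly. The synthetic data, the regularization weight and the choice of solver are illustrative assumptions and stand in for the new algorithms to be developed.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d, lam = 500, 20, 1e-2                           # samples, features, L2 weight
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)   # labels in {0, 1}

def loss(w):
    z = X @ w
    # logistic loss sum(log(1 + e^z) - y*z), written stably via logaddexp
    return np.logaddexp(0.0, z).sum() - y @ z + 0.5 * lam * w @ w

def grad(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))              # predicted probabilities
    return X.T @ (p - y) + lam * w

def hessp(w, v):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    s = p * (1.0 - p)                               # diagonal of the logistic Hessian
    return X.T @ (s * (X @ v)) + lam * v

res = minimize(loss, np.zeros(d), jac=grad, hessp=hessp, method="Newton-CG")
print(res.success, res.fun, np.linalg.norm(grad(res.x)))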

[1] Liuzzi, Lucidi, Rinaldi (2016). A derivative-free approach to constrained multiobjective nonsmooth optimization. SIAM J. Optim., 26.
[2] Liuzzi, Lucidi, Rinaldi (2015). Derivative-free methods for mixed-integer constrained optimization problems. JOTA, 164.
[3] Fasano, Liuzzi, Lucidi, Rinaldi (2014). A linesearch-based derivative-free approach for nonsmooth constrained optimization. SIAM J. Optim., 24.
[4] Liuzzi, Lucidi, Rinaldi, Vicente (2019). Trust-region methods for the derivative-free optimization of nonsmooth black-box functions. SIAM J. Optim., 26.
[5] Liuzzi, Lucidi, Rinaldi (2020). An algorithmic framework based on primitive directions and nonmonotone line searches for black-box optimization problems with integer variables. Math. Prog. Computation.
[6] Fasano, Lucidi (2009). A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization. Optim. Letters, 3.
[7] Al-Baali, Caliciotti, Fasano, Roma (2020). A class of approximate inverse preconditioners based on Krylov-subspace methods, for large-scale nonconvex optimization. SIAM J. Optim., 26.

Call code: 
2709234
