machine learning

An energy-aware hardware implementation of 2D hierarchical clustering

We propose an implementation of 2D hierarchical clustering tailored for power-constrained, low-precision hardware. In many application fields, such as smart sensor networks, a low computational capacity is mandatory for energy-saving purposes. In this context, we aim to deploy a tightly constrained hardware solution, using a parallel architecture with a low number of bits. The effectiveness of the proposed approach is corroborated by testing it on well-known 2D clustering datasets.
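As a rough, software-only illustration of the idea (not the authors' circuit), the sketch below runs plain single-linkage hierarchical clustering on 2D points whose coordinates have first been quantized to a small fixed-point grid, mimicking low-bit arithmetic; the bit width, cut threshold, and synthetic data are hypothetical choices for illustration.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def quantize(points, n_bits=4):
    """Min-max normalize 2D points and round them to an n_bits fixed-point grid
    (a stand-in for the low-precision arithmetic of the target hardware)."""
    lo, hi = points.min(axis=0), points.max(axis=0)
    levels = 2 ** n_bits - 1
    return np.round((points - lo) / (hi - lo) * levels) / levels

# Hypothetical 2D data: two well-separated blobs.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                 rng.normal(3.0, 0.3, (50, 2))])

q = quantize(pts, n_bits=4)                        # low-precision coordinates
Z = linkage(q, method='single')                    # hierarchical (single-linkage) clustering
labels = fcluster(Z, t=0.2, criterion='distance')  # cut the dendrogram at a distance threshold
print(np.unique(labels))                           # expected: two clusters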

A parallel hardware implementation for 2D hierarchical clustering based on fuzzy logic

In this paper we propose a novel hardware implementation of a two-dimensional, unconstrained hierarchical clustering method based on fuzzy logic and membership functions. Unlike classical clustering approaches, our work builds on an algorithm with intrinsic parallelism. This parallelism can be exploited to design an efficient hardware implementation suited to low-resource, low-power, yet computationally demanding applications such as smart sensors and IoT devices. We validated our design through an extensive simulation campaign on well-known 2D clustering datasets.
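To make the notion of membership-based parallelism concrete, here is a generic, hedged sketch (not the paper's algorithm): a Gaussian membership function evaluated for every (point, prototype) pair in one vectorized step, since each pair is independent this is exactly the kind of computation a parallel hardware design can exploit. The membership shape, width, and example coordinates are assumptions.

import numpy as np

def gaussian_membership(points, centers, sigma=0.5):
    """Degree of membership of each 2D point to each prototype centre.
    Every (point, centre) pair is independent, so the whole matrix can be
    computed in parallel."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    mu = np.exp(-d2 / (2.0 * sigma ** 2))
    return mu / mu.sum(axis=1, keepdims=True)   # normalize so memberships sum to 1

points = np.array([[0.1, 0.2], [2.9, 3.1], [1.5, 1.4]])
centers = np.array([[0.0, 0.0], [3.0, 3.0]])
print(gaussian_membership(points, centers).round(3))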

Classification and calibration techniques in predictive maintenance: A comparison between GMM and a custom one-class classifier

Modeling and predicting failures in the field of predictive maintenance is a challenging task. An important issue for an intelligent predictive maintenance system, also exploited in Condition Based Maintenance applications, is failure probability estimation, which turns out to be uncalibrated for most standard and custom classifiers based on machine learning.
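The calibration issue can be illustrated with a minimal sketch (not the paper's pipeline): a Gaussian Mixture Model fitted on healthy data yields anomaly scores that are not probabilities, and a simple Platt-style logistic mapping is one way to turn them into calibrated failure probabilities. The synthetic data, feature count, and variable names below are hypothetical.

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, (500, 3))      # hypothetical sensor features, healthy machines
faulty  = rng.normal(2.5, 1.0, (100, 3))      # hypothetical faulty machines

gmm = GaussianMixture(n_components=2, random_state=0).fit(healthy)

# Raw anomaly score: negative log-likelihood under the healthy model (not a probability).
X = np.vstack([healthy, faulty])
y = np.r_[np.zeros(len(healthy)), np.ones(len(faulty))]   # 1 = failure
score = -gmm.score_samples(X).reshape(-1, 1)

# Simple calibration: logistic (Platt-style) mapping from score to failure probability.
calib = LogisticRegression().fit(score, y)
p_fail = calib.predict_proba(score)[:, 1]
print(p_fail[:5].round(3), p_fail[-5:].round(3))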

Intelligent energy flow management of a nanogrid fast charging station equipped with second life batteries

In this paper we investigate a public Fast Charge (FC) station nanogrid equipped with a Photovoltaic (PV) system and an Energy Storage System (ESS) based on second-life Electric Vehicle (EV) batteries. Since the nanogrid is intended for installation in urban areas, it is designed with a very limited connection to the grid, so as to ensure peak shaving and encourage PV self-consumption.
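A toy rule-based dispatch loop can illustrate the two goals just stated (peak shaving via a hard cap on grid import, and PV self-consumption via a use-PV-first, store-the-surplus rule); the power limits, time series, and battery parameters below are invented for illustration and are not the paper's controller.

# Toy energy-flow rule for part of a day at 1-hour resolution (values in kW / kWh).
pv    = [0, 0, 0, 1, 3, 5, 6, 6, 5, 3, 1, 0]     # hypothetical PV production
load  = [2, 2, 3, 4, 8, 9, 9, 8, 7, 6, 4, 3]     # hypothetical FC-station demand
grid_limit = 4.0                                 # peak-shaving cap on grid import
soc, soc_max = 5.0, 20.0                         # second-life battery state of charge

for t, (p, l) in enumerate(zip(pv, load)):
    net = l - p                                  # residual demand after PV (use PV first)
    if net <= 0:
        charge = min(-net, soc_max - soc)        # store the PV surplus in the ESS
        soc += charge
        grid = 0.0
    else:
        from_grid = min(net, grid_limit)         # never exceed the grid connection limit
        discharge = min(net - from_grid, soc)    # battery covers what the grid cannot
        soc -= discharge
        grid = from_grid
    print(f"h{t:02d} grid={grid:.1f} kW soc={soc:.1f} kWh")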

Proposal and investigation of an artificial intelligence (AI)-based cloud resource allocation algorithm in network function virtualization architectures

The long time needed to reconfigure cloud resources in Network Function Virtualization environments has led to the proposal of solutions that perform prediction-based resource allocation. All of them predict traffic or required resources by minimizing symmetric loss functions such as the Mean Squared Error. When inevitable prediction errors are made, these methodologies cannot weigh positive and negative prediction errors differently, even though the two impact the total network cost in different ways.
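The point about symmetric versus asymmetric losses can be shown with a small numeric sketch: the MSE scores over- and under-provisioning identically, while an asymmetric squared loss penalizes under-prediction (too few resources allocated) more heavily. The weighting factor is a hypothetical parameter, not the value used in the paper.

import numpy as np

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def asymmetric_mse(y_true, y_pred, under_weight=4.0):
    """Squared error that weighs under-prediction (y_pred < y_true) more heavily."""
    err = y_true - y_pred
    w = np.where(err > 0, under_weight, 1.0)
    return np.mean(w * err ** 2)

demand = np.array([10.0, 12.0, 15.0])
over   = demand + 1.0     # allocates one unit too much everywhere
under  = demand - 1.0     # allocates one unit too little everywhere

print(mse(demand, over), mse(demand, under))                        # identical: 1.0 1.0
print(asymmetric_mse(demand, over), asymmetric_mse(demand, under))  # 1.0 vs 4.0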

Development of a data-driven model for turbulent heat transfer in turbomachinery

Machine Learning (ML) algorithms have become popular in many fields, including applications related to turbomachinery and heat transfer. The key properties of ML are the capability to partially offset the slowdown of Moore's law and to dig out correlations within large datasets such as those available in turbomachinery. Data come from experiments and simulations with different degrees of accuracy, depending on the test rig or the CFD approach.

Identification of poorly ventilated zones in gas-turbine enclosures with machine learning

Ventilation systems are used in gas turbine packages to control the air temperature, to protect electrical instrumentation and auxiliary items installed inside the enclosure and to ensure a proper dilution of potentially dangerous gas leakages. These objectives are reached only if the ventilation flow is uniformly distributed in the whole volume of the package, providing a good air flow quality as prescribed by international codes such as ISO 21789.

Transformer Networks for Trajectory Forecasting

Most recent successes in forecasting people's motion are based on LSTM models, and most recent progress has been achieved by modelling the social interaction among people and their interaction with the scene. We question the use of LSTM models and propose the novel use of Transformer Networks for trajectory forecasting. This is a fundamental switch from the sequential, step-by-step processing of LSTMs to the attention-only memory mechanisms of Transformers.
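A bare-bones PyTorch sketch of this architectural switch (dimensions, horizons, and the single-shot output head are placeholder choices, not the paper's configuration): past 2D positions are embedded, processed once by a self-attention encoder with no step-by-step recurrence, and projected to future positions.

import torch
import torch.nn as nn

class TrajectoryTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, pred_len=12):
        super().__init__()
        self.embed = nn.Linear(2, d_model)                   # (x, y) -> token embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, pred_len * 2)         # predict all future steps at once

    def forward(self, obs):                                  # obs: (batch, obs_len, 2)
        # Positional encoding omitted for brevity; attention sees the whole history at once.
        h = self.encoder(self.embed(obs))
        return self.head(h[:, -1]).view(obs.size(0), -1, 2)  # (batch, pred_len, 2)

model = TrajectoryTransformer()
past = torch.randn(16, 8, 2)          # 16 pedestrians, 8 observed positions each
future = model(past)
print(future.shape)                   # torch.Size([16, 12, 2])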

Predicting the spread of COVID-19 in Italy using machine learning: Do socio-economic factors matter?

We exploit the provincial variability of COVID-19 cases registered in Italy to select the territorial predictors of the pandemic. Absent an established theoretical diffusion model, we apply machine learning to isolate, among 77 potential predictors, those that minimize the out-of-sample prediction error. We first estimate the model on cumulative cases registered before the containment measures displayed their effects (i.e. at the peak of the epidemic in March 2020), and then on cases registered between the peak date and the relaxation of the containment measures in early June.
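One standard way to select the predictors that minimize out-of-sample prediction error is a cross-validated LASSO, which drives the coefficients of uninformative predictors to zero; the sketch below is only a hedged illustration of that idea on synthetic data, not the paper's model or its actual 77 provincial indicators.

import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_obs, n_feat = 107, 77                       # hypothetical: provinces x candidate predictors
X = rng.normal(size=(n_obs, n_feat))
beta = np.zeros(n_feat)
beta[[0, 3, 10]] = [2.0, -1.5, 1.0]           # only a few predictors truly matter
y = X @ beta + rng.normal(scale=0.5, size=n_obs)   # synthetic outcome (stand-in for case counts)

Xs = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5, random_state=0).fit(Xs, y)   # penalty chosen by out-of-sample error
selected = np.flatnonzero(lasso.coef_ != 0)
print("selected predictors:", selected)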
