
Clinical validation of 13-gene DNA methylation analysis in oral brushing samples for detection of oral carcinoma: an Italian multicenter study

The aim of this Italian multicenter study was to evaluate the diagnostic performance of a minimally invasive method for the detection of oral squamous cell carcinoma (OSCC) based on 13-gene DNA methylation analysis in oral brushing samples.

A flow-leak correction algorithm for pneumotachographic work-of-breathing measurement during high-flow nasal cannula oxygen therapy

Measuring work of breathing (WOB) is an intricate task during high-flow nasal cannula (HFNC) therapy because the continuous unidirectional flow toward the patient makes pneumotachography technically difficult to use. We implemented a new method for measuring WOB based on a differential pneumotachography (DP) system, equipped with one pneumotachograph inserted in the HFNC circuit and another connected to a monitoring facemask, combined with a leak correction algorithm (LCA) that corrects flow

Particle swarm with domain partition and control assignment for time-optimal maneuvers

A novel approach is proposed for planning time-optimal maneuvers under bang–bang external control. The optimizer is based on particle swarm optimization and requires only setting the maximum number of switches allowed for each axis. Two different test cases were analyzed and solved to validate the optimizer. In the first example, characterized by four state-space variables and no path constraints, convergence toward the optimal solution was demonstrated for different values of the maximum number of switches.
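The abstract's optimizer encodes bang–bang switching times as the particles' decision variables; as a minimal sketch of the underlying particle swarm mechanics only (not the paper's implementation — all names and parameter values here are illustrative assumptions), a basic PSO minimizing a test function looks like this:

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` over a box [lo, hi]^dim with a basic particle swarm."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + attraction to personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the 4-D sphere function (optimum 0 at the origin).
random.seed(0)
best, best_val = pso(lambda x: sum(v * v for v in x), dim=4, bounds=(-5.0, 5.0))
```

In a maneuver-planning setting, `dim` would be the maximum number of control switches per axis and `objective` the maneuver time plus constraint penalties.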

Scientific discovery reloaded

The way scientific discovery has been conceptualized has changed drastically in the last few decades: its relation to logic, inference, methods, and evolution has been deeply reloaded. The ‘philosophical matrix’ moulded by logical empiricism and analytical tradition has been challenged by the ‘friends of discovery’, who opened up the way to a rational investigation of discovery. This has produced not only new theories of discovery (like the deductive, cognitive, and evolutionary), but also new ways of practicing it in a rational and more systematic way.

Fully automated decision-making processes in the insurance sector

This study aims to analyze the issues raised by the use of big data in the insurance sector, with particular reference to fully automated decision-making processes. Through sophisticated predictive algorithms that process and evaluate a multitude of information about the individual user, insurance companies can offer «user centric» coverage that is highly personalized and adaptable to each user's particular needs.

Comparison of time-domain finite-difference, finite-integration, and integral-equation methods for dipole radiation in half-space environments

In this paper we compare current implementations of three commonly used numerical techniques, the Finite-Difference Time-Domain (FDTD) method, the Finite-Integration Technique (FIT), and Time-Domain Integral Equations (TDIE), applied to the canonical problem of a horizontal dipole antenna radiating over lossless and lossy half-spaces. These types of environment are important starting points for simulating many Ground Penetrating Radar (GPR) applications, which operate in the near-field of the antenna, where the interaction among the antenna, the ground, and targets is important.
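The FDTD method compared above advances Maxwell's equations by leapfrogging electric- and magnetic-field updates on a staggered (Yee) grid. A minimal sketch of that update structure, assuming a 1-D free-space toy problem with a soft Gaussian source and normalized fields (nothing like the full half-space dipole model the paper simulates):

```python
import math

def fdtd_1d(nz=200, nt=180, src=100):
    """Minimal 1-D free-space FDTD: Yee grid, normalized fields,
    Courant number 0.5 (the 0.5 factors in both updates)."""
    ez = [0.0] * nz          # electric field at integer grid points
    hy = [0.0] * nz          # magnetic field at half-integer points
    t0, spread = 40.0, 12.0  # Gaussian source timing and width
    for n in range(nt):
        # Update E from the spatial difference of H.
        for k in range(1, nz):
            ez[k] += 0.5 * (hy[k - 1] - hy[k])
        # Soft (additive) Gaussian excitation at the source cell.
        ez[src] += math.exp(-0.5 * ((t0 - n) / spread) ** 2)
        # Update H from the spatial difference of E (half step later).
        for k in range(nz - 1):
            hy[k] += 0.5 * (ez[k] - ez[k + 1])
    return ez

# Two pulses propagate outward from the source cell.
ez = fdtd_1d()
```

The leapfrog staggering in space and time is what gives the scheme its second-order accuracy; the comparison with FIT and TDIE in the paper concerns far more elaborate 3-D models.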

The Parametric Architecture Project towards Cognitive Computing

The work investigates the possible perspectives opened up by the use of algorithms and digital tools in creative architecture, specifically in the process of design and shape control. Shapes are determined by size and, therefore, by numbers, which accordingly govern all art forms, including architecture. There has always been a relationship between architecture and mathematics, both in conception and transcription.

Virtual or virtuous ants? Toward an ethics of access

The technology known as Ant colony optimization (ACO), a subset of the broader field of artificial intelligence, and in particular of swarm theory, capitalizes on the behavior of large masses of users as they access network resources for searches of both a scientific and an ordinary, everyday nature. Through complex algorithms, the 'digital pheromones' released by users along their paths are tracked, so that subsequent users 'hunting' for the same targets can be directed more effectively and efficiently.
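The pheromone feedback loop at the heart of ACO can be sketched on a toy problem: ants repeatedly choose between two routes, deposit pheromone inversely proportional to route length, and pheromone evaporates each round. This minimal sketch (all names and parameter values are illustrative assumptions, not from the source) shows how the shorter route comes to dominate:

```python
import random

def aco_two_paths(len_short=1.0, len_long=2.0, n_ants=20, iters=100,
                  evaporation=0.1, alpha=1.0):
    """Toy ant colony: each round, every ant picks one of two routes with
    probability proportional to pheromone^alpha; deposits reward short routes."""
    tau = [1.0, 1.0]                 # pheromone on the short and long route
    lengths = [len_short, len_long]
    for _ in range(iters):
        deposits = [0.0, 0.0]
        for _ in range(n_ants):
            weights = [t ** alpha for t in tau]
            r = random.random() * sum(weights)
            route = 0 if r < weights[0] else 1
            deposits[route] += 1.0 / lengths[route]  # shorter route earns more
        # Evaporate old pheromone, add this round's deposits.
        tau = [(1 - evaporation) * t + d for t, d in zip(tau, deposits)]
    return tau

random.seed(0)
tau = aco_two_paths()
# Pheromone concentrates on the shorter route, steering later ants toward it.
```

The 'digital pheromones' of the abstract play exactly the role of `tau` here: a shared trace, left by earlier users, that biases the choices of later ones.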

A Randomized Low Latency Resource Sharing Algorithm for Fog Computing

In this paper, we propose and study a low-latency resource sharing protocol for Fog Computing. The protocol has its roots in the power-of-random-choices family of randomized protocols. The protocol, dubbed $LL_g(T)$, is designed to cope with a heterogeneous set of nodes and with communication latency comparable to the task execution time, a characteristic of time-constrained applications supported by this service delivery model.
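The power-of-random-choices idea the protocol builds on is simple: instead of assigning each task to one node chosen at random, probe a small number d of random nodes and pick the least loaded. A minimal sketch of that baseline mechanism (illustrative only — not the paper's $LL_g(T)$ protocol, which additionally handles heterogeneity and latency):

```python
import random

def assign_tasks(n_nodes=50, n_tasks=5000, d=2, seed=0):
    """Power-of-d-random-choices dispatch: probe d random nodes and queue the
    task at the least loaded one; d=1 degenerates to purely random assignment."""
    rng = random.Random(seed)
    load = [0] * n_nodes
    for _ in range(n_tasks):
        probes = rng.sample(range(n_nodes), d)      # d distinct random nodes
        target = min(probes, key=lambda i: load[i]) # least-loaded probe wins
        load[target] += 1
    return max(load)  # worst backlog across nodes

# Probing just two nodes per task sharply reduces the maximum backlog
# compared with a single random choice.
worst_random = assign_tasks(d=1)
worst_two = assign_tasks(d=2)
```

The well-known result is that moving from d=1 to d=2 reduces the maximum load imbalance exponentially, which is why even a second random probe is so effective.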

© Università degli Studi di Roma "La Sapienza" - Piazzale Aldo Moro 5, 00185 Roma