Dreaming neural networks: Rigorous results

01 Journal article
Agliari E., Alemanno F., Barra A., Fachechi A.
ISSN: 1742-5468

Recently, a daily routine for associative neural networks has been proposed: during the awake state the network learns via Hebb's rule (thus behaving as a standard Hopfield model), while during the sleep state it consolidates pure patterns and removes spurious ones, optimizing information storage. This forces the synaptic matrix to collapse to the projector matrix (ultimately approaching the Kanter-Sompolinsky model), allowing for the maximal critical capacity achievable with symmetric interactions. So far, this emerging picture (as well as the bulk of papers on unlearning techniques) has been supported solely by non-rigorous routes, e.g. replica-trick analyses and numerical simulations; here, instead, we rely on Guerra's interpolation techniques and extend the generalized stochastic stability approach to this case. Focusing on the replica-symmetric scenario (where the previous investigations lie), the former picture is fully confirmed. Furthermore, we develop a fluctuation analysis to determine where ergodicity is broken (an analysis absent in previous investigations). Remarkably, we find that, as long as the network is awake, ergodicity is bounded by the Amit-Gutfreund-Sompolinsky critical line (as it should be), but, as the network sleeps, spin-glass states are destroyed and both the retrieval and the ergodic regions widen. Thus, after a full sleeping session, the only surviving regions are the retrieval and ergodic ones, so that the network achieves a perfect retrieval regime.
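The contrast between the awake (Hebbian) coupling matrix and the fully slept (projector, Kanter-Sompolinsky) one can be illustrated numerically. The following minimal Python sketch compares one-step retrieval stability of a stored pattern under the two couplings; the sizes `N`, `K` and the full-rank random patterns are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 200, 20                            # neurons, stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(K, N))     # random binary patterns

# "Awake" coupling: standard Hebbian / Hopfield matrix
J_hebb = (xi.T @ xi) / N

# "Asleep" coupling: projector onto the span of the patterns,
# which removes the cross-talk between them
C = (xi @ xi.T) / N                       # pattern-overlap (correlation) matrix
J_proj = xi.T @ np.linalg.inv(C) @ xi / N

# One-step stability of a stored pattern under sign (zero-temperature) dynamics
def retrieval_overlap(J, pattern):
    return np.mean(np.sign(J @ pattern) == pattern)

m_hebb = retrieval_overlap(J_hebb, xi[0])
m_proj = retrieval_overlap(J_proj, xi[0])
print(f"Hebb overlap: {m_hebb:.3f}, projector overlap: {m_proj:.3f}")
```

Since the projector matrix satisfies `J_proj @ xi[mu] == xi[mu]` exactly, every stored pattern is a perfect fixed point of the dynamics, whereas the Hebbian coupling leaves residual cross-talk noise that grows with the load K/N.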
