Hopfield model

Replica symmetry breaking in neural networks: A few steps toward rigorous results

In this paper we adapt the broken replica interpolation technique (developed by Francesco Guerra for the Sherrington-Kirkpatrick model, namely a pairwise mean-field spin glass whose couplings are i.i.d. standard Gaussian variables) so that it also covers the Hopfield model (i.e. a pairwise mean-field neural network whose couplings are drawn according to Hebb's learning rule). This is accomplished by grafting Guerra's telescopic averages onto the transport-equation technique recently developed by some of the authors.
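For orientation, the two models being compared can be written explicitly; the notation below is standard and is not taken from the paper, and normalizations vary by convention. For N Ising spins $\sigma_i=\pm1$, the Sherrington-Kirkpatrick couplings are i.i.d. standard Gaussians, while the Hopfield couplings are built from K stored binary patterns $\xi^\mu$ through Hebb's rule:

\[
H_{\mathrm{SK}}(\sigma)=-\frac{1}{\sqrt{N}}\sum_{1\le i<j\le N} J_{ij}\,\sigma_i\sigma_j,
\qquad J_{ij}\sim\mathcal{N}(0,1)\ \text{i.i.d.},
\]
\[
H_{\mathrm{Hopfield}}(\sigma)=-\frac{1}{2N}\sum_{i\ne j}\Big(\sum_{\mu=1}^{K}\xi_i^{\mu}\xi_j^{\mu}\Big)\sigma_i\sigma_j,
\qquad \xi_i^{\mu}=\pm1\ \text{i.i.d. with equal probability}.
\]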

Boltzmann machines as generalized Hopfield networks: A review of recent results and outlooks

The Hopfield model and the Boltzmann machine are among the most popular examples of neural networks. The latter, widely used for classification and feature detection, is able to efficiently learn a generative model from observed data and constitutes the benchmark for statistical learning. The former, designed to mimic the retrieval phase of an artificial associative memory, lies in between two paradigmatic statistical-mechanics models, namely the Curie-Weiss and the Sherrington-Kirkpatrick models, which are recovered as the limiting cases of one and many stored memories, respectively.
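As a brief illustration of the one-memory limit mentioned above (a standard argument, not quoted from the review): with a single stored pattern $\xi$, the Hebbian Hamiltonian is mapped onto the Curie-Weiss model by the gauge transformation $\sigma_i\to\xi_i\sigma_i$, since $\xi_i^2=1$:

\[
H(\sigma)=-\frac{1}{2N}\sum_{i\ne j}\xi_i\xi_j\,\sigma_i\sigma_j
\ \xrightarrow{\ \sigma_i\,\to\,\xi_i\sigma_i\ }\
-\frac{1}{2N}\sum_{i\ne j}\sigma_i\sigma_j=H_{\mathrm{CW}}(\sigma).
\]

In the opposite regime, where the number of stored patterns grows proportionally to the system size, the Hebbian couplings behave, in a suitable sense, like the random Gaussian couplings of the Sherrington-Kirkpatrick model, which is the "many stored memories" limit referred to in the abstract.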
