Hopfield networks

(3.4 hours to learn)

Summary

Hopfield networks are a kind of recurrent neural network that implements an associative (content-addressable) memory. The behavior of the network can be understood in terms of minimizing an energy function.
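
For reference, the energy function usually takes the following standard form (sign and bias conventions vary slightly across the core resources below). For binary units s_i taking values +1 or -1, symmetric weights w_ij with no self-connections, and biases b_i:

    E(s) = -\frac{1}{2} \sum_{i \neq j} w_{ij} s_i s_j - \sum_i b_i s_i

Each asynchronous update, which sets a unit to the sign of its total input, never increases E, which is why the network's dynamics can be viewed as descending this energy function.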

Context

-this concept has no prerequisites-

Goals

  • Know the rules for (a short sketch of both follows this list)
    • learning the weights of a Hopfield net
    • finding a low energy configuration
  • Understand why the dynamics of the network can be described in terms of an energy function.
  • Understand what happens when you try to store too many memories in a Hopfield net.
  • Know how Hopfield nets can be applied to optimization problems (e.g. image interpretation, the Traveling Salesman Problem).
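
To make the first two goals concrete, here is a minimal NumPy sketch of one common convention: Hebbian weight learning from stored +/-1 patterns, plus asynchronous updates that descend the energy function. The function names, the 1/N weight scaling, and the toy example are illustrative choices, not taken from the resources below.

    import numpy as np

    def train_hopfield(patterns):
        """Hebbian rule: sum of outer products of the stored +/-1 patterns, no self-connections."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)
        return W / n

    def energy(W, s):
        """Hopfield energy (bias terms omitted); asynchronous updates never increase it."""
        return -0.5 * s @ W @ s

    def recall(W, s, n_sweeps=10, seed=0):
        """Asynchronous updates: set each unit to the sign of its total input, in random order."""
        rng = np.random.default_rng(seed)
        s = s.copy()
        for _ in range(n_sweeps):
            for i in rng.permutation(len(s)):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    # Toy usage: store two random patterns, then recover one from a corrupted cue.
    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(2, 100))
    W = train_hopfield(patterns)
    cue = patterns[0].copy()
    cue[:20] *= -1                                    # flip 20% of the bits
    recovered = recall(W, cue)
    print(energy(W, cue), energy(W, recovered))       # energy should drop
    print(np.mean(recovered == patterns[0]))          # fraction of bits recovered, ideally 1.0

With only two stored patterns of 100 units each, recall from the corrupted cue should succeed; the capacity question in the goals above concerns what happens as the number of stored random patterns grows toward roughly 0.14 times the number of units.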

Core resources (read/watch one of the following)

-Free-

Coursera: Neural Networks for Machine Learning (2012)
An online course by Geoff Hinton, who invented many of the core ideas behind neural nets and deep learning.
Author: Geoffrey E. Hinton
Other notes:
  • Skim the introductory lectures for a general motivation of neural nets.
Information Theory, Inference, and Learning Algorithms
A graduate-level textbook on machine learning and information theory.
Author: David MacKay
Other notes:
  • The argument about variational energy and the mathematical analysis of Section 42.7 are optional.

See also

  • Boltzmann machines (and in particular [restricted Boltzmann machines (RBMs)](restricted_boltzmann_machines)) are a modern probabilistic analogue of Hopfield nets.
  • The mean field approximation updates in an Ising model have a similar form to the Hopfield net update rule.