Neurodynamics

Anders Sandberg, asa@nada.kth.se

My lecture notes on Hopfield networks (PostScript)

My lecture notes on Optimization and Boltzmann machines (PostScript)

Reading instructions for Haykin, Neural Networks: A Comprehensive Foundation

*** = Important
** = Intermediate
* = Background or for pleasure only
11.1 Introduction *
11.2 Statistical Mechanics *
11.3 Markov chains *
11.4 Metropolis algorithm **
11.5 Simulated annealing ***
11.6 Gibbs Sampling **
11.7 Boltzmann machine ***
11.8 Sigmoid belief networks **
11.9 Helmholtz machine **
11.10 Mean field theory **
11.11 Deterministic Boltzmann Machine *
11.12 Deterministic Sigmoid *
11.13 Deterministic Annealing *
11.14 Summary *
14.1 Introduction *
14.2 Dynamical systems **
14.3 Stability of Equilibrium states **
14.4 Attractors *
14.5 Neurodynamical Models **
14.6 Manipulation of Attractors **
14.7 Hopfield Model ***
14.8 Computer experiment *
14.9 Cohen-Grossberg Theorem ***
14.10 Brain-state-in-a-box **
14.11 Computer experiment *
14.12 Strange attractors and chaos *
14.13 Dynamic reconstruction *
14.14 Computer experiment *
14.15 Summary **

Links

There are plenty of demonstrations of the Hopfield network on the net.
http://www.phy.syr.edu/courses/PHY307.96Fall/Hopfield_demo/hop.html
http://diwww.epfl.ch/lami/team/michel/neural/tutorial/english/hopfield/applet.html

The Boltzmann Machine: Necker Cube Example. http://psy.uq.edu.au/~mav/java/Necker.html

Simulated Annealing http://chem1.nrl.navy.mil/~shaffer/optsa.html
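
The page above covers simulated annealing for optimization. As a rough sketch of the idea, here is a minimal annealing loop in Python; it assumes the standard Metropolis acceptance rule and a geometric cooling schedule, and the cost and neighbor functions are placeholders for whatever problem is being optimized:

    import math, random

    def anneal(cost, neighbor, x0, T=1.0, cooling=0.995, steps=10000):
        """Minimize cost() by random search with a slowly falling temperature."""
        x, e = x0, cost(x0)
        for _ in range(steps):
            x_new = neighbor(x)          # propose a random move
            e_new = cost(x_new)
            # Metropolis rule: always accept improvements; accept a worse
            # state with probability exp(-(e_new - e) / T).
            if e_new <= e or random.random() < math.exp((e - e_new) / T):
                x, e = x_new, e_new
            T *= cooling                 # geometric cooling schedule
        return x, e

At high temperature almost any move is accepted, so the search can escape local minima; as T falls, the dynamics freeze into a (hopefully near-optimal) state.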

Boltzmann Machines (short note) http://www.cs.indiana.edu/hyplan/gasser/C463_4/boltzmann.html

Course notes about the Hopfield network, Boltzmann machines, and Brain-State-in-a-Box from a course at McMaster University: http://www.psychology.mcmaster.ca/3W03/hopfield.html, http://www.psychology.mcmaster.ca/3W03/nlam.html

Arun Jagota's lecture notes for "Introduction to the Theory of Neural Computation" by John Hertz, Anders Krogh and Richard G. Palmer. http://www.phy.duke.edu/HKP/jagota/

Further Reading

"Introduction to the Theory of Neural Computation" by John hertz, Anders Krogh and Richard G. Palmer. A classic.

"Modeling Brain Function" by Daniel J. Amit. Another classic.

J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA 79(8):2554-2558, April 1982. The paper that started everything. From the abstract:

Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
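
The model in the paper is easy to sketch. Below is a minimal Hopfield network in Python, assuming Hebbian (outer-product) weight storage and the asynchronous threshold updates mentioned in the abstract; the function names are illustrative, not taken from the paper:

    import numpy as np

    def train(patterns):
        """Store a set of +/-1 patterns with the Hebbian outer-product rule."""
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0)           # no self-connections
        return W / n

    def recall(W, state, steps=200, rng=None):
        """Asynchronously update randomly chosen units; from a sufficiently
        large subpart of a stored pattern the full pattern is retrieved."""
        rng = rng or np.random.default_rng()
        state = state.copy()
        for _ in range(steps):
            i = rng.integers(len(state))                 # one unit at a time
            state[i] = 1 if W[i] @ state >= 0 else -1    # threshold net input
        return state

    # Example: store one 8-unit pattern, corrupt two bits, recall it.
    p = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    W = train(p[None, :])
    cue = p.copy(); cue[:2] *= -1                        # damage the cue
    print(np.array_equal(recall(W, cue), p))             # usually True

Because the weights are symmetric with zero diagonal, each asynchronous update can only lower (or keep) the network's energy, so the state flows toward the stored pattern as a fixed-point attractor.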
