April 19, 2011

Great neural networks think alike

It is nice to see one's ideas reinvented independently - they might not be right, but at least they are not entirely due to one's own random thinking.

[1104.0305] Dynamical Synapses Enhance Neural Information Processing: Mobility, Memory and Decoding by C. C. Alan Fung, K. Y. Michael Wong, He Wang, Si Wu presents a nice look at how bumps of activity in continuous attractor neural networks are affected by short-term depression and short-term facilitation. They find that depression destabilizes the bumps, making them mobile, while facilitation enhances them, stabilizing the network response against noise.
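The basic setup can be sketched in a few lines of numpy. This is a minimal, illustrative 1D ring attractor with divisive inhibition and an optional depression variable, loosely in the spirit of the model class the paper analyzes - the parameter values are my own guesses, not the authors', and facilitation is left out:

```python
import numpy as np

def simulate_bump(use_depression=False, n=100, steps=2000, dt=0.05,
                  tau=1.0, tau_d=50.0, beta=0.05):
    """Minimal 1D ring CANN, optionally with short-term depression.

    All parameter values are illustrative, not taken from the paper.
    """
    x = np.linspace(-np.pi, np.pi, n, endpoint=False)
    a = 0.5                                 # width of recurrent Gaussian
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, 2 * np.pi - d)        # distance on the ring
    # With this grid, density * spacing = 1, so sums replace integrals.
    J = np.exp(-d**2 / (2 * a**2)) / (np.sqrt(2 * np.pi) * a)

    k = 0.5                                 # divisive inhibition strength
    u = np.zeros(n)                         # synaptic input
    p = np.ones(n)                          # synaptic resources (depression)
    cue = 0.5 * np.exp(-x**2 / (2 * a**2))  # transient cue at x = 0

    for t in range(steps):
        r = np.maximum(u, 0.0) ** 2
        r = r / (1.0 + k * r.sum())         # divisively normalized firing
        I_ext = cue if t * dt < 10.0 else 0.0
        u += dt * (-u + J @ (p * r) + I_ext) / tau
        if use_depression:
            # resources recover slowly, are consumed where firing is high
            p += dt * ((1.0 - p) / tau_d - beta * p * r)
    return x, u, p
```

Without depression the cue nucleates a bump that sits where it was placed after the input is removed. With `use_depression=True`, the resource variable p dips wherever the bump sits - exactly the self-generated asymmetry that can set the bump drifting, which is the destabilization the paper analyzes.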

This is pretty similar to what I did in my paper A working memory model based on fast Hebbian learning (Network: Computation in Neural Systems, Volume 14, Issue 4, pp. 789-802, 2003; also turned into a chapter in my thesis). I went a bit further and claimed working memory could be entirely dependent on very rapid Hebbian plasticity creating a suitable connection matrix. As far as I know, nobody has ever found anything like that; synaptic facilitation is all we have to work with for the moment.
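The idea can be caricatured with a Hopfield-style toy network: a single presentation writes the connection matrix via a one-shot Hebbian outer product, and that matrix alone then holds the item as an attractor that a degraded cue can retrieve. This sketch is just the textbook caricature with my own parameter choices, not the actual model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
pattern = rng.choice([-1, 1], size=n)   # the item to be held in memory

# "Fast Hebbian" imprinting: one presentation writes the whole weight
# matrix as an outer product, rather than slow averaging over trials.
W = np.outer(pattern, pattern) / n
np.fill_diagonal(W, 0)

# Recall from a degraded cue: flip 20% of the bits.
cue = pattern.copy()
flip = rng.choice(n, size=n // 5, replace=False)
cue[flip] *= -1

# The imprinted matrix pulls the state back to the stored pattern.
state = cue
for _ in range(5):
    state = np.sign(W @ state)
    state[state == 0] = 1

overlap = (state * pattern).mean()      # 1.0 means perfect recall
```

With a single stored pattern the 60%-correct cue falls back onto the attractor in one update; the point of the working memory proposal was that the imprinting step itself happens fast enough to be the memory.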

The fun part of Fung et al. is that they demonstrate (both analytically and in simulations; I wish I had that analytical treatment of my model!) just how the adaptation dynamics can serve to change the bump behaviour. They point out that modulation of the adaptation could make different regions (or the same region during different tasks) handle the bumps differently. In the light of the adaptation models I did in my thesis, this fits in very nicely: it is important not just to be able to imprint stable attractors in the cortical dynamics, but to make the state jump from one to another in the right way.

Posted by Anders3 at April 19, 2011 05:36 PM