Abstract:
Due to its strikingly regular structure, the cerebellum is widely thought to implement a universal neuronal computation. The leading candidate is the `adaptive filter', which is analogous to an analysis-synthesis filter whose output weights are modified by a simple synaptic learning rule. In this formulation, the cerebellar granular layer forms part of the analysis pathway and is commonly assumed to implement a spatio-temporal recoding in which inputs are recombined into an expanded set of output signals. The nature of this recoding is unknown, although the layer's dense connectivity suggests that circuit-level mechanisms play an important role, a view supported by simulations of recurrent neural networks. Using computational simulations of neural network models of the cerebellar granular layer, I examine how network structure enables the effective generation of adaptive filter basis signals, and relate this to the known granular layer microcircuit. `Cerebellum-like' structures in sharks and electric fish are thought to be the evolutionary precursors of the cerebellum, and have been characterised as adaptive filters that cancel the predictable component of a sensory signal. The sophistication of the recoding implemented by cerebellum-like structures appears to increase over evolutionary time in a way that parallels increasingly recurrent connectivity. Networks constructed using a neural network training algorithm demonstrate the potential versatility of the granular layer circuit, whereas a more realistic `winner-take-all' network reproduces some of its experimentally known properties.