Single neuron dynamics and computation
Introduction
The computation performed by single neurons can be defined as a mapping from afferent spike trains to the output spike train that is communicated to their postsynaptic targets. This mapping is stochastic, owing to noise sources such as channel and synaptic noise, and plastic, owing to both intrinsic and synaptic forms of plasticity.
For many years, the dominant conceptual model of single-neuron computation was the binary McCulloch-Pitts neuron [45]. In this model, the input vector is multiplied by a weight vector and the result is passed through a threshold (see Fig. 1a). By adjusting its synaptic weights and threshold, such a neuron can learn arbitrary linearly separable dichotomies of the input space [63].
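A minimal sketch of the McCulloch-Pitts computation follows; the function name and the AND-gate weights are illustrative choices, not taken from the original papers:

```python
import numpy as np

def mcculloch_pitts(x, w, theta):
    """Binary threshold unit: fire (1) iff the weighted input sum reaches theta."""
    return int(np.dot(w, x) >= theta)

# A logical AND of two binary inputs is one linearly separable dichotomy
w = np.array([1.0, 1.0])
theta = 1.5
outputs = [mcculloch_pitts(np.array(x), w, theta)
           for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# outputs == [0, 0, 0, 1]
```

Learning rules such as the perceptron algorithm adjust `w` and `theta` until the desired dichotomy is realised.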
This model has been conceptually tremendously useful, but it ignores fundamental temporal and spatial properties of neurons: the complex dynamics generated by a panoply of voltage-gated ionic currents; and the fact that synaptic inputs are stochastic, history-dependent and spread over a large dendritic tree. In this paper, we will review recent advances in our understanding of how these properties affect computation in single neurons.
Computation and dynamics: LNP/GL models and their relationship to neuronal biophysics
Electrophysiological data from various sensory systems have been successfully fitted by linear-nonlinear-Poisson (LNP) or generalized linear models (GLMs) [65]. In the LNP model, the inputs are first convolved with a linear temporal filter, also called a kernel (the L operation). The result is then passed through a static nonlinearity (the N operation), yielding an instantaneous firing rate. Finally, spikes are generated as an inhomogeneous Poisson process with this instantaneous firing rate (the P operation).
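The three LNP stages can be sketched in a few lines; the exponential kernel, softplus nonlinearity, and all parameter values below are illustrative assumptions, not fitted to any data:

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.001                                  # time step (s)
t = np.arange(0, 0.2, dt)
stimulus = rng.standard_normal(t.size)      # white-noise input

# L: convolve the stimulus with a causal exponential kernel
tau = 0.02
kernel = np.exp(-np.arange(0, 0.1, dt) / tau)
drive = np.convolve(stimulus, kernel)[:t.size] * dt

# N: static nonlinearity mapping drive to a non-negative rate (Hz)
rate = 50.0 * np.log1p(np.exp(10.0 * drive))

# P: inhomogeneous Poisson spikes, one Bernoulli draw per small bin
spikes = rng.random(t.size) < rate * dt
```

For small `dt`, one Bernoulli draw per bin with probability `rate * dt` is a standard discrete-time approximation of an inhomogeneous Poisson process.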
Impact of dendritic non-linearities on computation
Dendritic trees are highly complex structures that support computations richer than mere linear summation [39, 9, 67, 42]. Qualitatively, four different types of behavior can arise at the level of local dendritic branches, shown schematically in Figure 2:
- (i) Sub-linear summation due to passive cable properties of thin dendrites has been observed in cerebellar stellate cells [3•], which could allow these cells to be selective to sparse, rather than focused, presynaptic activity;
- (ii) Linear
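The distinction between sub- and supralinear dendritic summation can be illustrated with toy input-output curves; the saturating and sigmoidal functions and their parameters below are hypothetical choices, meant only to show how co-activated inputs sum differently under each regime:

```python
import numpy as np

def sublinear(g, g_half=5.0):
    """Saturating I/O curve, as in a passive thin dendrite: output grows
    more slowly than input, so clustered inputs are penalised."""
    return g / (g + g_half)

def supralinear(g, g_half=5.0, slope=1.5):
    """Sigmoidal I/O curve, as with regenerative dendritic events:
    near-threshold clustered inputs are amplified."""
    return 1.0 / (1.0 + np.exp(-slope * (g - g_half)))

# two co-activated inputs of strength g on one branch, compared with
# twice the response to a single input
g = 3.0
sub_gain = sublinear(2 * g) / (2 * sublinear(g))      # < 1: sublinear summation
sup_gain = supralinear(2 * g) / (2 * supralinear(g))  # > 1: supralinear summation
```

A gain below one means the branch favours inputs spread sparsely across branches; a gain above one means it favours spatially clustered, co-active inputs.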
Synaptic computation and filtering
The dynamics of synaptic transmission lead to a form of short-term plasticity, specific to each pre- and postsynaptic cell-class pair, that shapes the amplitudes of successive post-synaptic potentials (PSPs). This history dependence of the synaptic response (Fig. 3) can be characterised as either: depression, in which successive synaptic amplitudes decrease through depletion of presynaptic resources such as neurotransmitter vesicles, which take a finite time, of the order of 100s of milliseconds, to replace; or facilitation, in which successive amplitudes transiently increase.
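Resource-depletion models of short-term depression, in the style of Tsodyks and Markram, capture this behaviour with two parameters: the fraction `U` of available resources released per spike and a recovery time constant `tau_rec`. The sketch below is a simplified version of such a model; the parameter values are illustrative:

```python
import numpy as np

def depressing_psp_amplitudes(spike_times, U=0.5, tau_rec=0.3):
    """Relative PSP amplitudes for a spike train under resource depletion.
    Each spike releases a fraction U of the available resources x, which
    then recover toward 1 with time constant tau_rec (seconds)."""
    x = 1.0
    amps = []
    last_t = None
    for t in spike_times:
        if last_t is not None:
            # exponential recovery of resources since the previous spike
            x = 1.0 - (1.0 - x) * np.exp(-(t - last_t) / tau_rec)
        amps.append(U * x)   # PSP amplitude ~ resources released
        x -= U * x           # depletion by this spike
        last_t = t
    return amps

# a regular 20 Hz train: successive PSPs shrink toward a steady state
amps = depressing_psp_amplitudes(np.arange(0, 0.5, 0.05))
```

Because recovery (hundreds of milliseconds here) is slow relative to the inter-spike interval, each successive amplitude is smaller than the last, relaxing toward a rate-dependent steady state.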
Acknowledgements
We thank Dr Gilad Silberberg for use of the data in Figure 3.
References (75)
- et al., Thin dendrites of cerebellar interneurons confer sublinear synaptic integration and a gradient of short-term plasticity, Neuron (2012)
- et al., Sparsely synchronized neuronal oscillations, Chaos (2008)
- et al., Linear summation of excitatory inputs by CA1 pyramidal neurons, Neuron (1999)
- et al., Gain modulation from background synaptic input, Neuron (2002)
- et al., Short-term synaptic depression causes a non-monotonic response to correlated stimuli, J. Neurosci. (2005)
- et al., Graded persistent activity in entorhinal cortex neurons, Nature (2002)
- et al., Structure-preserving model reduction of passive and quasi-active neurons, J. Comput. Neurosci. (2013)
- et al., A role for NMDA-receptor channels in working memory, Nat. Neurosci. (1998)
- et al., Sensitivity of firing rate to input fluctuations depends on time scale separation between fast and slow variables in single neurons, J. Comput. Neurosci. (2009)
- et al., Synaptic theory of working memory, Science (2008)