“The Response of Single Neurons to Varying Stimulus Statistics” by Brian Lundstrom
Introduction
The fundamental unit of the nervous system is the neuron. Each neocortical neuron receives
input from 5,000-10,000 other neurons [1], processes this input, and then outputs a signal in the
form of action potentials, which are sudden changes in the neuronal membrane voltage (Figure
1). Neurons communicate information through trains of these action potentials, or spikes, and it
is in this way that humans can, for example, consciously respond to environmental cues in only
hundreds of milliseconds [2]. Yet, the manner in which action potentials represent information,
sometimes known as the neural code, remains unknown [3-5]. Some suggest that action
potentials are generated according to very simple rules, such as simply summing inputs [6-8],
but evidence as presented here supports an alternative, that neurons can perform complex
computations by spiking.
[Figure 1 graphic: input current vs. time and output voltage vs. time for a layer 2/3 pyramidal neuron from rat neocortex]
Figure 1: Neurons receive current inputs, and respond with an output of action potentials or spikes.
The total input that a neuron receives at its cell body comes from as many as 10,000 neurons. Parts of this
analog input are effectively transformed into an output of all-or-none action potentials, which can be
thought of as a digital, or binary, signal. (Data from Brian Lundstrom, Matthew Higgs, and William
Spain.)
This dissertation treats one link in the chain of events that creates, transmits, and interprets the
neural code. It focuses on the small piece of excitable membrane near the cell body that
generates action potentials [9]. We recorded spikes generated from this piece of membrane in
response to a variety of inputs in order to map the transformation between input and output.
Our goal was to consider inputs that fluctuate randomly in time as a model for realistic
neuronal inputs, and to discover how the transformation from input to output depends on
statistical characteristics of the input. We considered this transformation for the cases in which
the stimulus statistics were at steady state as well as when they varied with time.
We obtained data from both model neurons, simulated by computer, and single in vitro
pyramidal neurons, recorded from rat brain slices. These neurons can be considered to be rather
generic neurons of the neocortex. For modeling and experimental data, we used Gaussian white
noise injected current inputs, which both approximates the inputs the neuron’s cell body would
receive in vivo [10-15, for discussion, see 16], and has a number of analytical advantages [3, 17].
In a Gaussian white noise input, the stimulus value at each time point is independently chosen
from a Gaussian distribution¹, and this kind of input can be characterized by two statistics: its
mean and its variance (Figure 2). The variance quantifies fluctuations in the input signal, and
only recently has there been a concerted effort towards understanding how it affects the
neuron’s output [18-26].
[Figure 2 graphic: input magnitude vs. time as the mean (μ) increases and as the variance (σ²) increases]
Figure 2: A Gaussian white noise input can be described by its mean, or average value, and variance,
or fluctuations about the average. For each time point, the input magnitude is chosen from a Gaussian
distribution with some mean and variance. Variance (σ²) is the square of the standard deviation (σ), which is
a measure of input fluctuations.
¹ All Gaussian white noise inputs used were actually filtered by an exponential kernel (see Chapters 2-5 for details). Strictly speaking, the inputs were somewhat correlated in time and therefore not truly "white" but rather "colored" noise.
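A stimulus of this kind can be sketched in a few lines. The generator below is a minimal illustration, not the experimental code: it draws independent Gaussian samples and applies an exponential filter with an assumed correlation time, producing the slightly "colored" noise described in the footnote.

```python
import numpy as np

def filtered_gaussian_noise(mean, sigma, tau_ms, dt_ms, duration_ms, rng=None):
    """Gaussian noise with the given mean and standard deviation, low-pass
    filtered by an exponential kernel with time constant tau_ms, so that
    samples are slightly correlated in time ('colored' rather than white)."""
    rng = np.random.default_rng() if rng is None else rng
    n = int(duration_ms / dt_ms)
    white = rng.standard_normal(n)
    a = np.exp(-dt_ms / tau_ms)          # one-step correlation of the filter
    b = np.sqrt(1.0 - a * a)             # keeps the stationary variance at 1
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = a * x[i - 1] + b * white[i]
    return mean + sigma * x

# 1 s of stimulus: mean 0.5, SD 0.2 (in nA, say), 3 ms correlation time
stim = filtered_gaussian_noise(mean=0.5, sigma=0.2, tau_ms=3.0, dt_ms=0.1,
                               duration_ms=1000.0)
```

The update rule is an autoregressive (Ornstein-Uhlenbeck-style) filter, so the output's mean and variance match the requested values while successive samples remain correlated over roughly tau_ms.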
Having specified the input to this excitable membrane, we must also quantify its output. This
dissertation focuses on the mean rate of spikes: the average number of spikes produced in a given amount of time. This is the simplest form of spike encoding, or way in
which input information can be stored in spike trains [6, 7, 27-29]. One straightforward way to
characterize neuronal responses is by determining the mean firing rate that results as stimulus
parameters are varied. As illustrated in Figure 3, the input mean and input variance can be
related to the mean firing rate by using f-I, or frequency-current, curves, where each trace of the
plot represents a different input variance, or standard deviation.
As statistics of the stimulus change, the response of the neuron changes. This transformation
between input and output defines how the neuron encodes stimulus information. Our work
examined how the encoding of time-varying stimuli by the spike trains of single neurons
depends on underlying neuronal biophysical parameters, such as the density and dynamics of
ion channels.
[Figure 3 graphic: (a) three input current conditions (1-3) and the spike outputs they elicit; (b) f-I curves: firing rate (Hz) vs. mean current (nA), one trace per input SD (0, 200, 400, 600 pA)]
Figure 3: Firing rate-input (f-I) curves can be used to relate the input mean and variance to the mean
firing rate that each input condition elicits. (a) Conditions 1, 2, and 3 represent three different input
conditions (top). For example, the difference between conditions 2 and 3 is that the input standard
deviation (SD) increased. These input conditions elicited different firing rate responses (bottom). (b)
When different traces in the f-I curve are well separated, the neuron is sensitive to input fluctuations,
since increasing input fluctuations lead to an increasing mean firing rate output.
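The fluctuation sensitivity read off an f-I curve can be illustrated with a toy simulation. The sketch below uses a leaky integrate-and-fire neuron, which is far simpler than the biophysical models studied here, and every parameter value (membrane time constant, resistance, threshold) is invented for illustration: with a subthreshold mean current the cell is silent at SD = 0 but fires once input fluctuations are added.

```python
import numpy as np

def lif_firing_rate(mean_nA, sd_nA, duration_s=5.0, dt=1e-4, seed=0):
    """Mean firing rate (Hz) of a leaky integrate-and-fire neuron driven
    by Gaussian noise current; all parameters are illustrative."""
    rng = np.random.default_rng(seed)
    tau_m, R, v_th, v_reset = 0.02, 40.0, 20.0, 0.0  # s, MOhm, mV, mV
    n = int(duration_s / dt)
    I = mean_nA + sd_nA * rng.standard_normal(n)     # noisy input current (nA)
    v, n_spikes = 0.0, 0
    for i in range(n):
        v += dt / tau_m * (-v + R * I[i])            # membrane integration
        if v >= v_th:                                # threshold crossing = spike
            n_spikes += 1
            v = v_reset
    return n_spikes / duration_s

# Subthreshold mean current: silent without fluctuations, active with them
rate_quiet = lif_firing_rate(0.45, 0.0)   # SD = 0: never reaches threshold
rate_noisy = lif_firing_rate(0.45, 2.0)   # SD > 0: fluctuations drive spikes
```

Sweeping mean_nA at several fixed sd_nA values would trace out a family of f-I curves like those in Figure 3b; well-separated traces correspond to strong sensitivity to input fluctuations.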
Three types of steady state neuron responses
First, we considered the case in which the time-varying stimuli had steady state statistics; that is, we asked how action potential generation depended on the stimulus's statistical properties when those properties were held fixed in time. Previous in vitro experimental
observations have shown that the firing rate responses of some neurons are very sensitive to
input fluctuations, while responses of others are not [18, 21]. Interestingly, we found that
commonly used neuron models (single-compartment, biophysical models) do not replicate this
diversity with their standard parameter values. Therefore, we used techniques from dynamical
systems analysis to examine simplified dynamical models and determine the mathematical
conditions necessary for this diversity in sensitivity to input fluctuations. We found that such
single-compartment, biophysical models can replicate this diversity if their parameters are
adjusted appropriately. We related the necessary mathematical
conditions to biophysical properties of single neurons, such as ion channel densities and
dynamics in the excitable membrane [30, 31]. This work allowed us to categorize neurons into
three novel, fundamental types based on how their mean firing rate depended on input
fluctuations at steady state (Figure 4).
[Figure 4 graphic: (a) f-I curves for an inhibitory interneuron, a PFC pyramidal neuron, and an auditory neuron: firing rate (Hz) vs. mean current (nA) at SD = 0, low SD, and high SD; (b) f-I curves from a single-compartment model tuned to produce the three response types (Type A, Type B+, Type B−): firing rate (Hz) vs. mean current (µA/cm²)]
Figure 4: Existing in vitro data from single neurons can be classified into three novel response types
according to sensitivity to input fluctuations. (a) Inhibitory interneurons show little sensitivity to input
fluctuations for high mean currents, while excitatory prefrontal cortex (PFC) pyramidal neurons and
auditory neurons show sensitivity to input fluctuations throughout their physiological dynamic range.
Data are from Arsiero et al. (2007) and Higgs et al. (2006). (b) A single-compartment, biophysical model
can replicate these characteristics when the densities and dynamics of its ion channels are adjusted
appropriately. The adjustments necessary to produce the three types correspond with known data.
These results provide a framework for understanding how the steady-state mean firing rate of a
neuron depends on the input mean and variance. For example, the modeling of Figure 4b
suggested that a key difference between Type A and Type B+ responses is that a slow
adaptation current is required for a Type B+ response. These
kinds of currents influence how the mean spike rate adapts to a constant stimulus, and are not
typically thought of as influencing steady-state sensitivity to input fluctuations. Our results then
provide a framework for understanding recent data that showed a positive correlation between
slow adaptation currents and sensitivity to input fluctuations similar to Type B+ responses
(Figure 5), as well as data related to neuronal responses in the auditory brainstem (John Rinzel
et al., unpublished data) similar to Type B-. Thus, this framework assists in classifying single
neuron responses as well as understanding the salient properties of highly complex in vitro
neurons.
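The link between a slow adaptation current and firing rate adaptation can be sketched with a toy model. The integrate-and-fire neuron below is not one of the conductance-based models analyzed in this work, and its parameters are invented for illustration: each spike increments a slowly decaying current a(t) that opposes the input, providing the increasing negative feedback described above, so the firing rate declines under a constant stimulus.

```python
import numpy as np

def adapting_lif_spikes(I_const, duration=2.0, dt=1e-4):
    """Integrate-and-fire neuron with a slow spike-triggered adaptation
    current a(t); returns spike times (sec). Parameters are illustrative."""
    tau_m, R, v_th, v_reset = 0.02, 40.0, 20.0, 0.0  # s, MOhm, mV, mV
    tau_a, a_inc = 0.5, 0.005                        # slow decay (s), nA/spike
    v, a, spike_times = 0.0, 0.0, []
    for i in range(int(duration / dt)):
        v += dt / tau_m * (-v + R * (I_const - a))   # adaptation opposes input
        a -= dt / tau_a * a                          # slow decay of adaptation
        if v >= v_th:
            spike_times.append(i * dt)
            v = v_reset
            a += a_inc                               # negative feedback per spike
    return np.array(spike_times)

spikes = adapting_lif_spikes(0.8)
early = int(np.sum(spikes < 0.25))   # spikes in the first 250 ms
late = int(np.sum(spikes > 1.75))    # spikes in the last 250 ms
# early > late: the rate declines under constant input (firing rate adaptation)
```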
[Figure 5 graphic: f-I curves for (a) a pyramidal neuron with a large sAHP and (b) a pyramidal neuron with a small sAHP: firing rate (Hz) vs. mean current (nA), one trace per input SD (0, 200, 400, 600 pA). Data from Higgs et al., J Neurosci, 2006]
Figure 5: Existing data show that neurons with more adaptation current show greater sensitivity to
input fluctuations. (a) Pyramidal neurons from layer 2/3 of the neocortex that have a large slow afterhyperpolarization (sAHP) current show greater sensitivity to input fluctuations than (b) neurons with
small sAHP currents. sAHP currents are one kind of slow adaptation current, which provides increasing
negative feedback as firing rate increases.
Fractional differentiation predicts responses for time-varying stimulus statistics
In addition to neuronal responses to steady state stimulus statistics, we considered the more
general problem of an input with a time-varying mean or variance, which would approximate
time-varying changes in the overall number of active inputs to the neuronal cell body or the
synchrony between inputs, respectively. After a change in the stimulus, the firing rate of
neurons shows adaptation. Typically, this adaptation has been thought to occur with a fixed
time scale. However, previous data from the fly visual system [26] showed that the time scale of
firing rate adaptation depends on how the stimulus changes (Figure 6).
[Figure 6 graphic: firing rate (spikes s⁻¹) vs. time (s)]
Figure 6: The apparent time scale of firing rate adaptation depends on the stimulus period. (a) In this
example of a switching experiment, the variance of a Gaussian white noise stimulus is changed
periodically between two values (top). Following an increase in stimulus variance, the average firing rate
of a motion sensitive H1 cell in the fly visual system (bottom) increased transiently and then relaxed
toward a baseline (colored traces). The time scale of this slow relaxation increased as the stimulus period was
increased from 5 sec (red trace) to 40 sec (gray trace). Data from Fairhall et al. [26] as in Wark et al. [32].
We wanted to know whether this multiple time-scale adaptation could be a property of single
neurons, rather than requiring a neural circuit. Therefore, we explored how the mean firing rate
of pyramidal neurons in the rat sensorimotor cortex adapted to inputs [33]. As in Figure 6, we
found evidence that how quickly the firing rate relaxes depends on how quickly the stimulus
switches between low and high values (Figure 7). As the stimulus period increased, the firing
rate adaptation slowed proportionately, and this suggests that a multiple time-scale process
underlies the neuron’s rate adaptation.
[Figure 7 graphic: (a) current input (nA) switching periodically between low- and high-variance values (top) and the firing rate (Hz) response with exponential fits (bottom) vs. time (sec); (b) fitted time constant τ vs. stimulus period T (sec)]
Figure 7: As the period of the stimulus increases, the firing rate adapts more slowly. (a) The injected
current stimulus into the single neuron switched between low and high variance values (top), as in Figure
6. Here, a 4 sec period is displayed, and the firing rate response was fit with two exponential curves of the
form A exp (-t /τ), where a larger time constant τ signifies a firing rate that adapts more slowly. (b) The
time scale τ increases roughly linearly as the stimulus period T increases.
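The exponential fits of Figure 7a can be sketched as a log-linear regression. This simplified version fits a single exponential and assumes the baseline rate is known, whereas the figure's fits use two exponential components.

```python
import numpy as np

def fit_decay_tau(t, rate, baseline):
    """Fit rate(t) ~ baseline + A*exp(-t/tau) by linear regression on
    log(rate - baseline); assumes rate stays above the known baseline."""
    slope, intercept = np.polyfit(t, np.log(rate - baseline), 1)
    return -1.0 / slope, np.exp(intercept)           # (tau, A)

# Synthetic check: recover tau and A from a noiseless decaying rate
t = np.linspace(0.0, 2.0, 200)
rate = 10.0 + 25.0 * np.exp(-t / 0.5)
tau, A = fit_decay_tau(t, rate, baseline=10.0)       # tau ~ 0.5, A ~ 25
```

Repeating such fits across stimulus periods T yields the τ-vs-T relation of Figure 7b.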
To further examine what kind of multiple-time scale process underlies this rate adaptation, we
recorded neuronal firing rates in response to sinusoidal input currents. Using sine wave inputs
is a standard method of linear systems analysis by which one characterizes the phase and gain
of the system for individual input frequencies or periods. In this case, the system in question is
that of neuronal spike generation, and we examined the phases and gains of the neuronal firing
rate output with respect to sine wave inputs. We found that the phase leads of the responses
were roughly constant with respect to stimulus period and that the gains decreased as a power
law as stimulus period increased. These two characteristics suggested that the neuronal
response to time-varying inputs might be described by fractional differentiation, since these
characteristics are properties of a fractionally differentiating linear system.
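Both signatures follow from the filter H(f) = (i2πf)^α: its gain is the power law (2πf)^α and its phase lead is the frequency-independent constant α·90°. A quick numerical check (α = 0.15 is used only as an example order):

```python
import numpy as np

alpha = 0.15
f = np.array([0.1, 0.5, 1.0, 5.0])        # stimulus frequencies (Hz)
H = (1j * 2 * np.pi * f) ** alpha         # fractional-differentiator filter
gain = np.abs(H)                          # equals (2*pi*f)**alpha
phase_deg = np.degrees(np.angle(H))       # equals alpha * 90 at every frequency

# Power-law gain: log-gain vs. log-frequency is a line of slope alpha
slope = np.polyfit(np.log(f), np.log(gain), 1)[0]
```

Constant phase with power-law gain is exactly the pattern observed in the sine wave experiments, which is what motivated the fractional differentiation description.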
Derivatives are commonly taken using integer orders, such as first or second derivatives. For
example, the first derivative of a straight line is its slope. However, non-integer order
derivatives, such as order 0.5, are equally well defined mathematically, and the 0.5 order
derivative of a straight line is something between the straight line and its slope. Figure 8
contains some examples of fractional derivatives of a square wave. Fractional differentiation
[34-38] is a linear operation that can be written in the time (t) or frequency (f) domain as:
r(t) = h(t) ∗ x(t)  ⇔  R(f) = H(f)X(f) = (i2πf)^α X(f) ,    (1)
where r(t) is the firing rate response to a time-varying input x(t), h(t) is the fractional
differentiating filter in the time domain, and H(f) = (i2πf)^α is its Fourier transform in the
frequency domain, where α is the order of fractional differentiation². In our experiments x(t)
was the time-varying mean or variance of the
Gaussian white noise stimuli.
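Eq. (1) can be implemented directly with the FFT. The sketch below assumes a periodic input and α > 0, and omits the offsets and filtering details treated in the chapters; as a sanity check, α = 1 recovers the ordinary derivative.

```python
import numpy as np

def fractional_derivative(x, dt, alpha):
    """Order-alpha fractional derivative via Eq. (1):
    R(f) = (i*2*pi*f)**alpha * X(f). Assumes x is periodic and alpha > 0."""
    X = np.fft.fft(x)
    f = np.fft.fftfreq(len(x), d=dt)
    H = (1j * 2 * np.pi * f) ** alpha
    H[0] = 0.0                  # constant (mean) component differentiates to zero
    return np.fft.ifft(H * X).real

# Sanity check: for alpha = 1 this is the ordinary derivative,
# e.g. d/dt sin(2*pi*t) = 2*pi*cos(2*pi*t)
t = np.arange(0, 1, 1.0 / 256)
d1 = fractional_derivative(np.sin(2 * np.pi * t), dt=1.0 / 256, alpha=1.0)
```

A useful property of this definition is the semigroup rule: applying the order-0.5 filter twice gives the same result as the order-1 filter, since the frequency-domain filters multiply.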
[Figure 8 graphic: a repeating square wave and its fractional derivatives for α = 0, 0.25, 0.50, 0.75, and 1]
Figure 8: Examples of fractional derivatives where α is the order of the derivative such that α = 0
means that no derivative was taken and α = 1 represents the first derivative of the repeating square
wave.
The mathematical properties of fractional differentiation allowed us to use neuronal response
data to find the order of fractional differentiation for these neurons in eight different ways, and
in each case the result was not significantly different from 0.15. This suggests that for rat
neocortical pyramidal neurons, we can use Eq. (1) with α = 0.15 to predict neuronal responses
to arbitrary stimulus waveforms x(t). One example is shown in Figure 9, where we see that
α = 0.15 predicts the neuronal response to the three-step stimulus well. Thus, the time-varying
response of these rather generic neocortical neurons can be described by this single number.
² In general, the order of fractional differentiation α is the only parameter that one needs to know in order to use Eq. (1) to predict neuronal firing rates. However, the predicted waveforms may also need an offset, which can be incorporated.
[Figure 9 graphic: measured firing rate (Hz) vs. time (sec), overlaid with fractional-differentiation predictions for α = 0.05, α = 0.15, and α = 0.25]
Figure 9: Fractional differentiation of order α = 0.15 predicts the neuronal response for layer 2/3
pyramidal neurons in the rat neocortex. The stimulus consisted of [high medium low medium]
variance steps for [8 4 8 4] sec, respectively. α = 0.05 and α = 0.25 are shown for comparison.
Inter-spike intervals can resolve ambiguities introduced by rate adaptation
From a coding perspective, one potential problem with the mean firing rate is that as a
consequence of rate adaptation, the mean firing rate can map onto more than one set of
stimulus parameters. For example, a mean firing rate of 20 spikes/sec might sometimes
correspond to a stimulus variance of 100 pA but at other times to a variance of 200 pA,
depending on how much the neuron has adapted to previous stimuli. However, we
showed that under these circumstances, spike trains nonetheless contain accurate information
about the stimulus statistics, encoded in the intervals between spikes rather than in the spike
rate [39]. Even during slow rate adaptation, which occurs on the order of seconds, neurons can
convey accurate information about sudden changes in stimulus variance within ~100 msec in
their inter-spike intervals.
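The interval-based readout can be sketched as follows. The spike times below are invented for illustration (the actual decoding analysis is in [39]): because each interval is available as soon as the next spike arrives, a step in stimulus variance is visible in the first few post-step intervals, long before a firing rate averaged over seconds reflects it.

```python
import numpy as np

# Hypothetical spike train: a variance step at t = 1.0 s shortens the intervals
spike_times = np.array([0.10, 0.35, 0.60, 0.85, 1.02, 1.08, 1.15, 1.23])
isis = np.diff(spike_times)              # inter-spike intervals (sec)

before = isis[spike_times[1:] <= 1.0]    # intervals completed before the step
after = isis[spike_times[1:] > 1.0]      # intervals completed after the step
# The post-step intervals (including the one straddling the step) are much
# shorter, so the change is detectable within ~100 ms of the step.
```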
Conclusion
Results of this dissertation address three primary issues: (1) how a neuron’s mean firing rate
depends on steady state stimulus statistics, (2) how a neuron’s mean firing rate changes with
time-varying statistics, and (3) how inter-spike intervals can provide information about the
stimulus that is unavailable from the mean firing rate. The initial chapter of this dissertation
synthesizes these results and outlines a way in which the stimulus mean and variance can be
decoded from the spike train for arbitrary inputs, although future work is needed to test this
hypothesis. In the past, it has been suggested that single neurons perform only the simplest of
computations as they process information, and that the power of the brain comes only from the
interactions of many neurons. However, work as described here suggests that even single
neurons of the neocortex compute a complex and mathematically elegant function of their
inputs.
References
1. Pakkenberg, B., et al., Aging and the human neocortex. Exp Gerontol, 2003. 38(1-2): p. 95-9.
2. VanRullen, R., R. Guyonneau, and S.J. Thorpe, Spike times make sense. Trends Neurosci, 2005. 28(1): p. 1-4.
3. Rieke, F., et al., Spikes: exploring the neural code. Computational neuroscience. 1997, Cambridge, Mass.: MIT Press.
4. Koch, C. and I. Segev, The role of single neurons in information processing. Nat Neurosci, 2000. 3 Suppl: p. 1171-7.
5. Koch, C., Biophysics of computation: information processing in single neurons. Computational neuroscience. 1999, New York: Oxford University Press.
6. Shadlen, M.N. and W.T. Newsome, Noise, neural codes and cortical organization. Curr Opin Neurobiol, 1994. 4(4): p. 569-79.
7. Dayan, P. and L.F. Abbott, Theoretical neuroscience: computational and mathematical modeling of neural systems. Computational neuroscience. 2001, Cambridge, Mass.: MIT Press.
8. Trappenberg, T.P., Fundamentals of Computational Neuroscience. 2002: Oxford University Press, USA.
9. Shu, Y., et al., Properties of action potential initiation in neocortical pyramidal cells: evidence from whole cell axon recordings. J Neurophysiol, 2006.
10. Moreno, R., et al., Response of spiking neurons to correlated inputs. Phys Rev Lett, 2002. 89(28 Pt 1): p. 288101.
11. Richardson, M.J., Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons. Phys Rev E Stat Nonlin Soft Matter Phys, 2004. 69(5 Pt 1): p. 051918.
12. Rudolph, M. and A. Destexhe, On the use of analytical expressions for the voltage distribution to analyze intracellular recordings. Neural Comput, 2006. 18(12): p. 2917-22.
13. Rudolph, M. and A. Destexhe, An extended analytic expression for the membrane potential distribution of conductance-based synaptic noise. Neural Comput, 2005. 17(11): p. 2301-15.
14. Destexhe, A., M. Rudolph, and D. Pare, The high-conductance state of neocortical neurons in vivo. Nat Rev Neurosci, 2003. 4(9): p. 739-51.
15. Destexhe, A., et al., Fluctuating synaptic conductances recreate in vivo-like activity in neocortical neurons. Neuroscience, 2001. 107(1): p. 13-24.
16. Rauch, A., et al., Neocortical pyramidal cells respond as integrate-and-fire neurons to in vivo-like input currents. J Neurophysiol, 2003. 90(3): p. 1598-612.
17. Simoncelli, E.P., et al., Characterization of neural responses with stochastic stimuli, in The New Cognitive Neurosciences, M. Gazzaniga, Editor. 2004, MIT Press. p. 327-338.
18. Arsiero, M., et al., The impact of input fluctuations on the frequency-current relationships of layer 5 pyramidal neurons in the rat medial prefrontal cortex. J Neurosci, 2007. 27(12): p. 3274-84.
19. Chance, F.S., L.F. Abbott, and A.D. Reyes, Gain modulation from background synaptic input. Neuron, 2002. 35(4): p. 773-82.
20. Fellous, J.M., et al., Synaptic background noise controls the input/output characteristics of single cells in an in vitro model of in vivo activity. Neuroscience, 2003. 122(3): p. 811-29.
21. Higgs, M.H., S.J. Slee, and W.J. Spain, Diversity of gain modulation by noise in neocortical neurons: regulation by the slow afterhyperpolarization conductance. J Neurosci, 2006. 26(34): p. 8787-99.
22. Shu, Y., et al., Barrages of synaptic activity control the gain and sensitivity of cortical neurons. J Neurosci, 2003. 23(32): p. 10388-401.
23. Brenner, N., W. Bialek, and R. de Ruyter van Steveninck, Adaptive rescaling maximizes information transmission. Neuron, 2000. 26(3): p. 695-702.
24. Kim, K.J. and F. Rieke, Temporal contrast adaptation in the input and output signals of salamander retinal ganglion cells. J Neurosci, 2001. 21(1): p. 287-99.
25. Smirnakis, S.M., et al., Adaptation of retinal processing to image contrast and spatial scale. Nature, 1997. 386(6620): p. 69-73.
26. Fairhall, A.L., et al., Efficiency and ambiguity in an adaptive neural code. Nature, 2001. 412(6849): p. 787-92.
27. Adrian, E.D. and Y. Zotterman, The impulses produced by sensory nerve endings: Part 2. The response of a Single End-Organ. J Physiol, 1926. 61(2): p. 151-171.
28. Abeles, M., Role of the cortical neuron: integrator or coincidence detector? Isr J Med Sci, 1982. 18(1): p. 83-92.
29. Konig, P., A.K. Engel, and W. Singer, Integrator or coincidence detector? The role of the cortical neuron revisited. Trends Neurosci, 1996. 19(4): p. 130-7.
30. Lundstrom, B.N., et al., Two Computational Regimes of a Single-Compartment Neuron Separated by a Planar Boundary in Conductance Space. Neural Comput, 2008.
31. Lundstrom, B.N., et al., Sensitivity of firing rate to input fluctuations depends on time scale separation between fast and slow variables in single neurons. Submitted.
32. Wark, B., B.N. Lundstrom, and A. Fairhall, Sensory adaptation. Curr Opin Neurobiol, 2007. 17(4): p. 423-9.
33. Lundstrom, B.N., et al., Fractional differentiation by neocortical pyramidal neurons. Submitted.
34. Kleinz, M. and T.J. Osler, A Child's Garden of Fractional Derivatives. The College Mathematics Journal, 2000. 31(2): p. 82-88.
35. Miller, K.S. and B. Ross, An introduction to the fractional calculus and fractional differential equations. 1993, New York: Wiley.
36. Oldham, K.B. and J. Spanier, The fractional calculus: theory and applications of differentiation and integration to arbitrary order. 2006, Mineola, N.Y.: Dover Publications.
37. Podlubny, I., Fractional differential equations: an introduction to fractional derivatives, fractional differential equations, to methods of their solution and some of their applications. Mathematics in science and engineering; v. 198. 1999, San Diego: Academic Press.
38. Sokolov, I.M., J. Klafter, and A. Blumen, Fractional Kinetics. Physics Today, 2002(Nov): p. 48-54.
39. Lundstrom, B.N. and A.L. Fairhall, Decoding stimulus variance from a distributional neural code of interspike intervals. J Neurosci, 2006. 26(35): p. 9030-7.