Temporal modulation of the dynamics of neuronal networks with
... trigger behavioral adaptation. We found evidence for (i) high spike count variability and (ii) temporal reliability (favored by temporal correlations) which respectively hindered and favored information transmission when monkeys were cued to switch the behavioral strategy. Also, we investigated the ...
Spike-based Winner-Take-All Computation in a Multi
... its performance and describing its implementation in a large-scale multi-chip vision system. The winner-take-all is a neuronal network that amplifies the strongest set of inputs and suppresses output from the others. In various neuroscience models, this function is used to make a selection out of a ...
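The winner-take-all function described in this snippet (amplify the strongest set of inputs, suppress the rest) can be illustrated with a minimal sketch. This is only an illustration of the selection operation, not the spiking multi-chip VLSI implementation the paper describes; the function names and parameters are placeholders.

```python
import numpy as np

def hard_wta(inputs):
    """Hard winner-take-all: only the single strongest input survives.

    Illustrative sketch only -- the system in the paper realizes this
    with spiking VLSI neurons, not an argmax.
    """
    out = np.zeros_like(inputs, dtype=float)
    winner = int(np.argmax(inputs))
    out[winner] = inputs[winner]
    return out

def soft_wta(inputs, gain=4.0):
    """Soft winner-take-all via a softmax-like normalization:
    strong inputs are amplified, weaker ones suppressed."""
    e = np.exp(gain * (inputs - np.max(inputs)))
    return e / e.sum()

rates = np.array([10.0, 55.0, 30.0, 52.0])  # hypothetical input firing rates
print(hard_wta(rates))  # only the 55 Hz channel remains
```

The soft variant keeps a graded output, which is closer to how competitive neural circuits are usually modeled; the hard variant makes a strict selection.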
Emerging resistive memory technologies for systems
... (CBRAM) and Metal-Oxide based Memory (OXRAM) can play in dedicated neuromorphic hardware. We focus on the emulation of synaptic plasticity effects such as long-term potentiation (LTP), long term depression (LTD) and spike-timing dependent plasticity (STDP) with RRAM synapses. We developed novel low- ...
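The plasticity effects this snippet lists can be made concrete with the standard pair-based STDP rule: pre-before-post spike pairs potentiate the synapse (LTP), post-before-pre pairs depress it (LTD), with exponentially decaying magnitude. This is the textbook rule, shown only to illustrate what the RRAM synapses are meant to emulate; the parameter values are arbitrary.

```python
import math

def stdp_dw(dt, a_plus=0.05, a_minus=0.055, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight update.

    dt = t_post - t_pre (ms). Pre-before-post (dt > 0) gives
    potentiation (LTP); post-before-pre (dt < 0) gives depression (LTD).
    Standard textbook rule, not the paper's RRAM programming scheme.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    else:
        return -a_minus * math.exp(dt / tau_minus)

print(stdp_dw(10.0))   # positive update: LTP
print(stdp_dw(-10.0))  # negative update: LTD
```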
Stochastic neural network dynamics: synchronisation and control
... been made particularly prominent. The brain has long been considered a sophisticated organic computing machine; computational neuroscience dates back to 1907, when the integrate-and-fire model of a neuron was first introduced [1]. Since then, neuronal models of varying complexity have been proposed ...
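The 1907 integrate-and-fire model referenced above can be sketched in a few lines: the membrane potential integrates its input current with a leak, and fires and resets when it crosses threshold. A minimal Euler-integration sketch with arbitrary illustrative parameters:

```python
def lif_spike_times(i_ext, dt=0.1, t_max=100.0,
                    tau=10.0, r=1.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron (forward Euler).

    tau * dV/dt = -V + R * I_ext; emit a spike and reset when V >= v_th.
    Parameter values are placeholders chosen for illustration.
    """
    v, t, spikes = v_reset, 0.0, []
    while t < t_max:
        v += dt / tau * (-v + r * i_ext)
        if v >= v_th:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

print(len(lif_spike_times(2.0)))  # suprathreshold input: regular spiking
print(lif_spike_times(0.5))      # subthreshold input: no spikes
```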
Synaptic plasticity: taming the beast
... total level of synaptic efficacy. A frequent approach in neural network models is to globally adjust all the synapses onto each postsynaptic neuron based on its level of activity3. The adjustment can take two forms, depending on whether the synapses to a particular neuron are changed by the same amo ...
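The two forms of global adjustment mentioned in this snippet are usually called subtractive (every synapse onto the neuron changes by the same amount) and multiplicative (every synapse changes in proportion to its strength). A minimal sketch of both, with hypothetical function names and a made-up target:

```python
import numpy as np

def scale_subtractive(w, target, rate=0.1):
    """Subtract the same amount from every synapse onto the neuron,
    nudging the total efficacy toward the target."""
    return w - rate * (w.sum() - target) / w.size

def scale_multiplicative(w, target):
    """Rescale all synapses proportionally so they sum to the target.

    Multiplicative scaling preserves relative weight differences;
    subtractive scaling does not.
    """
    return w * (target / w.sum())

w = np.array([0.1, 0.4, 0.5])
print(scale_multiplicative(w, 2.0))  # ratios between synapses preserved
```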
Modeling multiple time scale firing rate adaptation in a neural
... scale adaptation of a particular weighting in conductance-based models is difficult, as is assessing the effect of differing adaptation dynamics on neural networks. Here, the intent is to describe an approach for modeling multiple time scale rate adaptation in a neural network and demonstrate its use ...
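One common way to capture multiple time scale rate adaptation is to subtract a weighted sum of adaptation variables from the drive, each variable relaxing toward the current rate with its own time constant. The sketch below shows that general idea only; it is not the paper's model, and all parameters are illustrative.

```python
import numpy as np

def adapt_rate(i_ext, taus=(50.0, 500.0), gains=(0.3, 0.3),
               dt=1.0, t_max=2000.0):
    """Firing rate with several adaptation variables a_k:

        r = max(0, I_ext - sum_k g_k * a_k)
        tau_k * da_k/dt = r - a_k

    Fast and slow adaptation combine, so the rate decays on
    multiple time scales after a step input.
    """
    a = np.zeros(len(taus))
    rates = []
    for _ in range(int(t_max / dt)):
        r = max(0.0, i_ext - np.dot(gains, a))
        a += dt * (r - a) / np.array(taus)
        rates.append(r)
    return rates

rates = adapt_rate(10.0)
print(rates[0], rates[-1])  # rate adapts downward from its onset value
```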
Dynamics of sensory processing in the dual olfactory pathway of the
... mixtures at the AL output (Galizia and Kimmerle 2004; Krofczik et al. 2008; Yamagata et al. 2009). 3.2. Characteristic differences of lateral and median uniglomerular projection neurons Based on intracellular recording and staining of uniglomerular projection neurons, Müller et al. (2002) reported t ...
Social equality in the number of choice options is represented in the
... fixed for all participants (1,000 yen) so as not to influence the results of the subsequent ...
Convergence in Mammalian Nucleus of Solitary Tract During
... essential component of sensory circuits, and knowledge of receptive field development is important in understanding functional differentiation of sensory pathways. To understand better the development and maturation of neural circuits for salt taste processing, we have made measures of receptive field si ...
Learning to classify complex patterns using a VLSI network of
... difficult computational problem that artificial neural networks are confronted with. The performance of classical neural network models depends critically on an unrealistic feature, the fact that their synapses have unbounded weight. In contrast, biological synapses face the hard limit of physical b ...
Attention induces synchronization-based response gain in steady
... population activity by recording frequency-tagged SSVEPs from both attended and ignored stimuli simultaneously (thus controlling for influences on SSVEPs that were unrelated to attention), analyzing the scalp topography of attention effects (crucial for evaluating the attentional response and activi ...
Cognon Neural Model Software Verification and
... • On-line learning allows the brain to keep up with the amount of new information that enters through sensory inputs, by continuously refining the internal models of the world. When sensory data enters the brain there is no time to store it and process it later. Every new input needs to be processed ...
Artificial neural network
In machine learning and cognitive science, artificial neural networks (ANNs) are a family of statistical learning models inspired by biological neural networks (the central nervous systems of animals, in particular the brain) and are used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown. Artificial neural networks are generally presented as systems of interconnected "neurons" which exchange messages with each other. The connections have numeric weights that can be tuned based on experience, making neural nets adaptive to inputs and capable of learning. For example, a neural network for handwriting recognition is defined by a set of input neurons which may be activated by the pixels of an input image. After being weighted and transformed by a function (determined by the network's designer), the activations of these neurons are passed on to other neurons. This process is repeated until, finally, an output neuron is activated; this determines which character was read. Like other machine learning methods (systems that learn from data), neural networks have been used to solve a wide variety of tasks that are hard to solve using ordinary rule-based programming, including computer vision and speech recognition.
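The weight-sum-transform-and-pass-on process the article describes can be sketched as a small feed-forward pass. The weights below are random placeholders; in a real handwriting recognizer they would be learned from data, and the layer sizes are just one plausible choice for 28x28-pixel images and ten digit classes.

```python
import numpy as np

def forward(x, weights, biases):
    """Feed-forward pass of a fully connected network: each layer
    computes a weighted sum of its inputs, applies a nonlinearity,
    and passes the activations on to the next layer."""
    for w, b in zip(weights, biases):
        x = np.tanh(w @ x + b)
    return x

rng = np.random.default_rng(0)
sizes = [784, 32, 10]  # 28x28 input pixels -> hidden layer -> 10 outputs
weights = [rng.normal(0, 0.1, (m, n)) for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

out = forward(rng.normal(size=784), weights, biases)
print(int(np.argmax(out)))  # index of the most strongly activated output neuron
```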