
Complexity in Neuronal Networks
... structure in the same neuron. More and more recordings from central neurons have shown the prevalence of complex voltage- and time-dependent firing properties which must shape, to a certain extent, circuit function; 2) digital computer simulation power has increased tremendously and allows simulatio ...
Bayesian Computation in Recurrent Neural Circuits
... by a model computing the log-likelihood ratio of one target over the other (Carpenter & Williams, 1995). In another study, the saccadic response time distribution of monkeys could be predicted from the time taken by neural activity in area FEF to reach a fixed threshold (Hanes & Schall, 1996), sugges ...
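The threshold-crossing idea in this snippet can be sketched in a few lines: evidence such as a log-likelihood ratio rises at a rate that varies from trial to trial, and a response is triggered when it reaches a fixed level. The Gaussian rate distribution and parameter values below are illustrative assumptions, not values from Carpenter & Williams (1995) or Hanes & Schall (1996).

```python
import numpy as np

# Accumulate-to-threshold sketch: a decision signal rises at a rate that varies
# across trials, and the response is issued when it crosses a fixed threshold.
# All parameter values are illustrative assumptions.

rng = np.random.default_rng(0)

def saccade_latency(threshold=1.0, mean_rate=5.0, rate_sd=1.0, n_trials=10000):
    """Return simulated response times (s) for a rise-to-threshold model."""
    rates = rng.normal(mean_rate, rate_sd, size=n_trials)  # trial-to-trial rate variability
    rates = rates[rates > 0]                               # keep only rising trajectories
    return threshold / rates                               # time to reach the threshold

latencies = saccade_latency()
print(f"median latency: {np.median(latencies) * 1000:.0f} ms")
```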
Extended Liquid Computing in Networks of Spiking Neurons
... of processing elements give them the ability to store bits of information in their stable states (attractors in the language of dynamical systems). As shown in Figure 2, the global scheme for machine learning is simple: the neural network, which is a functional (or filter) of the input function u(· ...
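A minimal sketch of the global scheme the snippet refers to, assuming the usual liquid-state setup: a fixed, randomly connected recurrent network filters the input function u(·), and only a linear readout on its state is trained. The network size, weight scaling, and the delayed-recall target task are illustrative assumptions.

```python
import numpy as np

# A fixed, untrained "liquid" acts as a filter of the input function u(t);
# only a linear readout on its state is trained (ridge regression).
# Sizes, scalings and the target task are illustrative assumptions.

rng = np.random.default_rng(1)
n_res, T = 200, 1000
W_in = rng.normal(0, 1.0, size=n_res)                          # input weights
W = rng.normal(0, 1.0 / np.sqrt(n_res), size=(n_res, n_res))   # fixed recurrent weights

u = rng.uniform(-1, 1, size=T)                                 # input function u(t)
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])                           # untrained liquid dynamics
    states[t] = x

# Train only the readout on a simple memory task: y(t) = u(t - 5).
y = np.roll(u, 5)
reg = 1e-4
W_out = np.linalg.solve(states.T @ states + reg * np.eye(n_res), states.T @ y)
print("train MSE:", np.mean((states @ W_out - y) ** 2))
```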
Bayesian Retrieval In Associative Memories With Storage Errors
... the conditional probability distribution over possible patterns for retrieval. This contains all possible information that is available to an observer of the network and the initial input. Since this distribution is over exponentially many patterns, we use it to develop two approximate, but tractabl ...
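As a toy illustration of the conditional distribution described here, the sketch below computes a Bayesian posterior over a small set of stored binary patterns given a noisy cue, assuming independent bit-flip noise. Restricting the posterior to the stored set (rather than all exponentially many patterns) and the chosen flip probability are simplifying assumptions; the paper's two approximate retrieval schemes are not reproduced.

```python
import numpy as np

# Posterior over stored patterns given a noisy initial input, assuming
# independent bit-flip noise. Pattern set and flip probability are assumptions.

rng = np.random.default_rng(5)
patterns = rng.integers(0, 2, size=(4, 16))       # 4 stored binary patterns
p_flip = 0.2                                      # assumed input noise level

cue = patterns[2].copy()
noise = rng.random(16) < p_flip
cue[noise] ^= 1                                   # noisy version of pattern 2

# Likelihood of the cue under each stored pattern, normalized to a posterior.
hamming = (patterns != cue).sum(axis=1)
log_like = hamming * np.log(p_flip) + (16 - hamming) * np.log(1 - p_flip)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()
print("posterior over stored patterns:", np.round(posterior, 3))
```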
Cerebral Cortex
... While it is sufficient in certain circumstances for a single node to represent the input (local coding), it is desirable in many other situations to have multiple nodes providing a factorial or distributed representation. As an extremely simple example, consider three inputs (‘a’, ‘b’ and ‘c’), each of ...
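Since the snippet's three-input example is cut off, the following toy code only illustrates the general contrast it sets up between local coding (one node per input) and distributed coding (patterns over shared nodes); the particular codes are assumptions.

```python
import numpy as np

# Local vs. distributed coding for three inputs 'a', 'b', 'c'.
# The specific codes below are illustrative assumptions.

inputs = ['a', 'b', 'c']

# Local coding: one dedicated node per input (one-hot).
local = {s: np.eye(3, dtype=int)[i] for i, s in enumerate(inputs)}

# Distributed coding: each input is a pattern across shared nodes,
# so two nodes suffice for three inputs.
distributed = {'a': np.array([0, 1]),
               'b': np.array([1, 0]),
               'c': np.array([1, 1])}

for s in inputs:
    print(s, "local:", local[s], "distributed:", distributed[s])
```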
Beyond Control: The Dynamics of Brain-Body
... Although our earliest work on the evolution of walking utilized a traditional binary genetic algorithm, we switched to a real-valued evolutionary algorithm in subsequent work (Bäck, 1996). In this case, each individual is encoded as a vector of real numbers representing the time constants, biases a ...
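A hedged sketch of the real-valued encoding described here: each individual is a flat vector of time constants, biases and (assumed, since the snippet is truncated) connection weights, varied by Gaussian mutation. Parameter ranges, the mutation scheme and population handling are illustrative, not the authors' algorithm.

```python
import numpy as np

# Real-valued genome for a small recurrent network: time constants, biases,
# and (assumed) connection weights concatenated into one vector.
# Ranges, mutation and fitness handling are illustrative assumptions.

rng = np.random.default_rng(2)
N = 5  # neurons

def random_individual():
    taus = rng.uniform(0.1, 5.0, size=N)          # time constants
    biases = rng.uniform(-5.0, 5.0, size=N)       # biases
    weights = rng.uniform(-5.0, 5.0, size=N * N)  # connection weights (assumed part of genome)
    return np.concatenate([taus, biases, weights])

def mutate(ind, sigma=0.1):
    return ind + rng.normal(0.0, sigma, size=ind.shape)

population = [random_individual() for _ in range(20)]
child = mutate(population[0])
print("genome length:", child.size)
```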
Computing auditory perception - Machine Learning Group, TU Berlin
... We can also take a top-down approach. We can observe human performance of auditory activity taken as a whole, by means of psychological experiments. Experiments give rise to hypotheses about underlying cognitive principles that can be manifested by statistical inference. The principles discovered ca ...
Void fraction and flow regime determination by means of MCNP
... ANN-based modeling: In this non-invasive void fraction measuring system, there is a one-to-one mapping between each void fraction in a particular regime type and the corresponding measured spectra. There are some clear peaks in the spectra: for detectors at 180° and 140°, the two peaks of 50 and 59.5 keV ...
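One way to picture the ANN-based mapping described here is a small feed-forward regressor from spectral peak features (e.g., counts near the 50 and 59.5 keV peaks) to void fraction. The synthetic data, the scikit-learn model and the network size below are assumptions for illustration only, not the authors' setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Map assumed peak-count features of the detector spectra to void fraction
# with a small feed-forward network. Data and architecture are illustrative.

rng = np.random.default_rng(3)
void_fraction = rng.uniform(0.0, 1.0, size=500)
# Fake peak counts that vary monotonically with void fraction, plus noise.
peak_50keV = 1000 * (1 - 0.6 * void_fraction) + rng.normal(0, 10, 500)
peak_59keV = 800 * (1 - 0.4 * void_fraction) + rng.normal(0, 10, 500)
X = np.column_stack([peak_50keV, peak_59keV])

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:400], void_fraction[:400])
print("test R^2:", model.score(X[400:], void_fraction[400:]))
```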
Acquisition of Box Pushing by Direct-Vision
... Two of the three outputs are used as actor outputs. Each of them is used to generate a motor command for the right or left wheel. The random number added to each actor output as a trial factor is a uniform random number raised to the power of 3.0, with a value range of -0.1 to 0.1. The actor output after added b ...
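One plausible reading of the trial noise described here, treated as an assumption since the snippet does not fully specify the construction: draw a uniform number in [-1, 1], raise it to the power 3 (concentrating values near zero while preserving sign), and scale it so the result lies in [-0.1, 0.1].

```python
import numpy as np

# Assumed construction of the trial noise added to each actor output:
# cube a uniform number in [-1, 1] and scale to [-0.1, 0.1].

rng = np.random.default_rng(4)

def trial_noise():
    u = rng.uniform(-1.0, 1.0)
    return 0.1 * u ** 3          # in [-0.1, 0.1], biased toward small perturbations

actor_output = 0.5               # hypothetical actor output for one wheel
motor_command = actor_output + trial_noise()
print(motor_command)
```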
- Stem-cell and Brain Research Institute
... propagation through time, and recurrent back propagation are avoided [18]. Instead, learning is based on an association between activation vectors generated by sequences in the State D layer and appropriate output responses, described below. In this context, an integral part of this model that was ...
Institute of Psychology C.N.R.
... We have already described some simulations using such a model of development for neural networks [Nolfi and Parisi, in press]. However, in that work the environment had no role in the developmental changes that occurred in the individual under genetic control. In the present model the genetic materi ...
PDF
... what we might call soft and hard switching. Due to the existence of a threshold for action potential generation, hard switching can be accomplished by strong inhibition. In other words, a neuron can be switched from a responsive to a nonresponsive state by hyperpolarizing it below threshold so it canno ...
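A minimal leaky integrate-and-fire sketch of this hard-switching idea, with illustrative parameters only: tonic inhibition holds the membrane below spike threshold, so the same excitatory drive that normally evokes spikes evokes none.

```python
# Leaky integrate-and-fire sketch of hard switching: with strong tonic
# inhibition the membrane stays below threshold and no spikes are produced.
# Parameters (in mV, ms) are illustrative assumptions.

def count_spikes(drive, inhibition, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0,
                 tau=20.0, dt=1.0, steps=1000):
    v, spikes = v_rest, 0
    for _ in range(steps):
        v += (-(v - v_rest) + drive - inhibition) * dt / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

print("no inhibition:    ", count_spikes(drive=20.0, inhibition=0.0), "spikes")
print("strong inhibition:", count_spikes(drive=20.0, inhibition=15.0), "spikes")
```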
Self-Adaptive Genotype-Phenotype Maps
... steps, from its initial state, according to the encoded rule table. ...
The Bifurcating Neuron Network 1
... where each integrator represents a neuron. These two examples suggest that the possibilities for building a chaotic network out of non-chaotic elements are plentiful. However, we decided to follow the other option, a network of chaotic neurons, for the following reason: chaotic activity will be more useful in th ...
Neural Networks
... • Neurons grouped into networks – Axons send outputs to cells – Received by dendrites, across synapses ...
Chapter 2 Decision-Making Systems, Models, and Support
... • Neurons grouped into networks – Axons send outputs to cells – Received by dendrites, across synapses ...
Artificial neural network
In machine learning and cognitive science, artificial neural networks (ANNs) are a family of statistical learning models inspired by biological neural networks (the central nervous systems of animals, in particular the brain). They are used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown. Artificial neural networks are generally presented as systems of interconnected "neurons" which exchange messages with each other. The connections have numeric weights that can be tuned based on experience, making neural nets adaptive to inputs and capable of learning.

For example, a neural network for handwriting recognition is defined by a set of input neurons which may be activated by the pixels of an input image. After being weighted and transformed by a function (determined by the network's designer), the activations of these neurons are passed on to other neurons. This process is repeated until, finally, an output neuron is activated; this determines which character was read.

Like other machine learning methods (systems that learn from data), neural networks have been used to solve a wide variety of tasks that are hard to solve using ordinary rule-based programming, including computer vision and speech recognition.
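A minimal sketch of the forward pass described above: input neurons are activated by image pixels, each layer applies a weighted sum and a nonlinearity, and the most active output neuron determines which character was read. The weights below are random and untrained, so this only illustrates the data flow, not a working recognizer.

```python
import numpy as np

# Forward pass of a small feed-forward network on a flattened image.
# Weights are random (untrained); layer sizes are illustrative assumptions.

rng = np.random.default_rng(6)
pixels = rng.random(28 * 28)                      # input neurons activated by pixels

W1, b1 = rng.normal(0, 0.1, (64, 784)), np.zeros(64)
W2, b2 = rng.normal(0, 0.1, (10, 64)), np.zeros(10)

hidden = np.tanh(W1 @ pixels + b1)                # weighted sum, then nonlinearity
logits = W2 @ hidden + b2                         # one output neuron per character class
print("recognized class:", int(np.argmax(logits)))
```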