Physical Neural Networks
Jonathan Lamont
November 16, 2015
Contents
• Artificial Neural Network
• History of Development
• AHaH Computation
• Future Goals of AHaH System
Neural Computation
• Understanding biological neural systems falls on a spectrum between mimicry and algorithmic solutions
• To mimic the brain, shift the question from "how do brains compute?" to "how do brains build and repair themselves as dissipative attractor-based structures?"
What is an Artificial Neural Network?
• ANNs are generally densely connected groups of individual cells, called neurons, that produce desired outputs for trained inputs (trained rather than programmed); a minimal code sketch follows the diagram slides below
• Inspired by biological neural networks
• Used in machine learning and cognitive science to estimate functions
Neuron Diagram
ANN Diagram
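As a generic illustration of the structure just described (not code from the slides), the sketch below builds a tiny feedforward network in Python; the layer sizes, random weights, and tanh activation are arbitrary assumptions chosen for demonstration.

```python
import math
import random

def neuron(inputs, weights, bias):
    """One generic neuron: weighted sum of inputs plus bias, squashed by tanh."""
    return math.tanh(bias + sum(x * w for x, w in zip(inputs, weights)))

def forward(inputs, layers):
    """Propagate inputs through a list of layers; each layer is (weights, biases)."""
    activations = inputs
    for weights, biases in layers:
        activations = [neuron(activations, w_row, b)
                       for w_row, b in zip(weights, biases)]
    return activations

# Tiny 2-3-1 network with random weights. Untrained, so the output is
# arbitrary until a learning rule (backpropagation classically, or the AHaH
# rule later in this deck) adjusts the weights toward desired behavior.
random.seed(0)
hidden = ([[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)], [0.0, 0.0, 0.0])
output = ([[random.uniform(-1, 1) for _ in range(3)]], [0.0])
print(forward([0.5, -0.2], [hidden, output]))
```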
Inspiration for Physical Implementation
• IBM's simulation with 1 billion neurons and 10 trillion synapses required 147,456 CPUs and 144 TB of memory, and consumed 2.9 MW of power while running at 1/83 real-time.
– Based on these estimates, simulating a human cortex on von Neumann architecture would require roughly 7 GW of power (a back-of-envelope check follows this slide)
– The average human neocortex contains about 150,000 billion (150 trillion) connections
• Core Adaptive Power Problem: energy is wasted during memory-processor communication.
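As a rough sanity check of the 7 GW figure (my own back-of-envelope linear extrapolation, not a calculation from the slides), the numbers above can be scaled to real time and to cortical synapse counts:

```python
# Back-of-envelope extrapolation from the IBM simulation figures above.
# Assumes power scales linearly with both simulation speed and synapse
# count, which is a simplification; the quoted 7 GW may rest on different
# assumptions, but the order of magnitude agrees.
sim_power_w = 2.9e6       # 2.9 MW at 1/83 real-time
realtime_factor = 83      # speed-up needed to reach real time
sim_synapses = 10e12      # 10 trillion synapses simulated
cortex_synapses = 150e12  # ~150 trillion cortical connections

power_gw = sim_power_w * realtime_factor * (cortex_synapses / sim_synapses) / 1e9
print(f"~{power_gw:.1f} GW")  # ~3.6 GW, same order of magnitude as 7 GW
```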
Adaptive Power Problem
• Constant dissipation of free energy allows living systems to adapt at all scales
• Each adaptation must reduce to memory-processor communication as state variables are modified
– The energy consumed in moving this information grows linearly with the number of state variables that must be continuously modified
History of Development
• Alan Turing's 'Intelligent Machinery' (1948)
– Describes a machine consisting of artificial neurons connected in any pattern, with modifier devices
– Modifier devices could pass or destroy a signal
• Hebb proposed synaptic plasticity (1949)
– Describes the ability of synapses to strengthen or weaken over time in response to increases or decreases in activity
• ADALINE, or "Adaptive Linear Neuron" (1960)
– A physical device using electrochemical plating of carbon rods to emulate synaptic elements called memistors
– Memistor: a resistor with memory, able to perform logic operations and store information
History of Development (Continued)
• Memristor theory (1971): a memristor regulates the flow of electrical current in a circuit and remembers the charge that previously flowed through it; it retains this memory without power (non-volatile)
• Very-large-scale integration (VLSI) (1980): integrated circuits created by combining thousands of transistors into a single chip
– Helped enable Neural Networks (Hopfield), Neuromorphic Engineering (Mead), and Physics of Computation (Feynman)
AHaH Computation
• Exploits the controversial "Fourth Law of Thermodynamics" from Swenson (1989) to train weights:
– "A system will select the path or assembly of paths out of available paths that minimizes the potential or maximizes the entropy at the fastest rate given the constraints."
• This thermodynamic process can be understood in two steps:
1) Particles spread out along available pathways and erode differentials that favor one path or the other
2) Pathways that lead to dissipation are stabilized
Figure 1. AHaH process.
Nugent MA, Molter TW (2014) AHaH Computing – From Metastable Switches to Attractors to Machine Learning. PLoS ONE 9(2): e85175. doi:10.1371/journal.pone.0085175
http://journals.plos.org/plosone/article?id=info:doi/10.1371/journal.pone.0085175
AHaH Plasticity
• Define the synaptic weight:
w = Ga − Gb, where Ga and Gb are conductance variables (a simplified software sketch follows Figure 4 below)
• Anti-Hebbian (erase path)
– A modification to the synaptic weight that reduces the probability that the synaptic state will remain the same upon subsequent measurement
• Hebbian (select path)
– A modification to the synaptic weight that increases the probability that the synaptic state will remain the same upon subsequent measurement
Figure 4. A differential pair of memristors forms a synapse.
Nugent MA, Molter TW (2014) AHaH Computing – From Metastable Switches to Attractors to Machine Learning. PLoS ONE 9(2): e85175. doi:10.1371/journal.pone.0085175
http://journals.plos.org/plosone/article?id=info:doi/10.1371/journal.pone.0085175
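A minimal software sketch of the differential-pair idea, assuming a crude device model in which each memristor is just a bounded conductance nudged up or down (the paper's metastable-switch model is richer; the step size, bounds, and class interface here are illustrative assumptions):

```python
class DifferentialSynapse:
    """Synaptic weight w = Ga - Gb built from two bounded conductances.

    Crude stand-in for a memristor pair: each 'device' is a conductance
    clipped to [G_MIN, G_MAX], and Hebbian/anti-Hebbian events nudge the
    pair in opposite directions. Constants are illustrative, not from the paper.
    """
    G_MIN, G_MAX, STEP = 0.0, 1.0, 0.05

    def __init__(self, ga=0.5, gb=0.5):
        self.ga, self.gb = ga, gb

    @property
    def weight(self):
        return self.ga - self.gb

    def _clip(self, g):
        return min(self.G_MAX, max(self.G_MIN, g))

    def hebbian(self):
        """Select path: push the weight further in its current direction."""
        d = self.STEP if self.weight >= 0 else -self.STEP
        self.ga = self._clip(self.ga + d)
        self.gb = self._clip(self.gb - d)

    def anti_hebbian(self):
        """Erase path: pull the weight back toward zero."""
        d = self.STEP if self.weight >= 0 else -self.STEP
        self.ga = self._clip(self.ga - d)
        self.gb = self._clip(self.gb + d)

s = DifferentialSynapse()
for _ in range(4):
    s.hebbian()
print(s.weight)  # weight grows away from zero under repeated Hebbian updates
```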
Linear Neuron Model
• y = b + Σᵢ xᵢwᵢ, summed over i = 0 to N
– y is the output of the selected neuron, b is the bias of the function, xᵢ is the output of neuron i, and wᵢ is the weight between the selected neuron and neuron i
• Weights and biases change based on the AHaH rule
– α and β are constants that control the relative contribution of Anti-Hebbian and Hebbian plasticity (a hedged sketch of one plausible form follows below)
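A minimal sketch of the linear neuron with an AHaH-style update. The slides give only the constants α and β, not the update equation itself, so the rule below, Δwᵢ = xᵢ(α·sign(y) − β·y), is an assumption consistent with the descriptions above (the sign term reinforces the current state, Hebbian; the −βy term erodes it, Anti-Hebbian), not necessarily the paper's exact form:

```python
def ahah_neuron_step(x, w, b, alpha=0.01, beta=0.005):
    """One evaluate-and-update cycle of a linear neuron, y = b + sum(x_i * w_i).

    Assumed AHaH-style rule: dw_i = x_i * (alpha * sign(y) - beta * y).
    The alpha term reinforces the current decision (Hebbian, 'select path');
    the beta term pulls y back toward zero (Anti-Hebbian, 'erase path').
    Constants and the exact functional form are illustrative assumptions.
    """
    y = b + sum(xi * wi for xi, wi in zip(x, w))
    sign_y = 1.0 if y >= 0 else -1.0
    w = [wi + xi * (alpha * sign_y - beta * y) for xi, wi in zip(x, w)]
    b = b + (alpha * sign_y - beta * y)  # bias treated as a weight on a constant input
    return y, w, b

w, b = [0.0, 0.0], 0.0
for _ in range(50):
    y, w, b = ahah_neuron_step([1.0, -0.5], w, b)
print(y, w)  # y settles where the Hebbian and Anti-Hebbian terms balance
```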
AHaH for Machine Learning
• AHaH node: uses the AHaH rule to bisect the input space with a hyperplane, making a binary decision
• Since there are many ways the AHaH node can bisect the input, the hyperplane chosen to bisect the input space is "the one that maximizes the separation of the two classes" of inputs (see the toy demo after Figure 2 below).
Figure 2. Attractor states of a two-input AHaH node.
Nugent MA, Molter TW (2014) AHaH Computing – From Metastable Switches to Attractors to Machine Learning. PLoS ONE 9(2): e85175. doi:10.1371/journal.pone.0085175
http://journals.plos.org/plosone/article?id=info:doi/10.1371/journal.pone.0085175
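To see the attractor behavior qualitatively, here is a self-contained toy run using the same assumed update rule as the sketch above (the clusters, constants, and seed are made up for illustration): two input clusters drive a two-input node without any labels, and the decision boundary tends to settle so that sign(y) separates them.

```python
import random

# Toy demonstration of an AHaH-style node settling into a decision boundary.
# Two synthetic 2-D input clusters (made-up data); update rule is the same
# assumed form as above: dw_i = x_i * (alpha * sign(y) - beta * y).
random.seed(1)
cluster_a = [(1.0 + random.gauss(0, 0.2), 1.0 + random.gauss(0, 0.2)) for _ in range(50)]
cluster_b = [(-1.0 + random.gauss(0, 0.2), -1.0 + random.gauss(0, 0.2)) for _ in range(50)]
data = cluster_a + cluster_b

alpha, beta = 0.05, 0.02
w, b = [random.uniform(-0.1, 0.1) for _ in range(2)], 0.0
for _ in range(20):                        # unsupervised passes over the data
    for x in random.sample(data, len(data)):
        y = b + sum(xi * wi for xi, wi in zip(x, w))
        s = 1.0 if y >= 0 else -1.0
        w = [wi + xi * (alpha * s - beta * y) for xi, wi in zip(x, w)]
        b += alpha * s - beta * y

side_a = {1.0 if b + sum(xi * wi for xi, wi in zip(x, w)) >= 0 else -1.0 for x in cluster_a}
side_b = {1.0 if b + sum(xi * wi for xi, wi in zip(x, w)) >= 0 else -1.0 for x in cluster_b}
print(side_a, side_b)  # ideally {1.0} and {-1.0} (or swapped): the clusters are split
```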
AHaH Power Consumption
• A major motivation is to eliminate the von Neumann bottleneck of memory-processor communication
• Energy dissipated per update is defined by an expression in the source paper; in practice, the energy consumption per synapse per update is around 12 pJ, although scaling trends project it to drop to 2 pJ in the future (a rough power estimate follows below).
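For a rough feel of what 12 pJ per synapse per update means at brain scale (my own illustrative arithmetic; the 100 Hz update rate is a hypothetical assumption, not a figure from the slides):

```python
# Illustrative power estimate at cortical scale. The update rate is a
# hypothetical assumption chosen for illustration; only the 12 pJ figure
# and the 150e12 connection count come from the slides.
synapses = 150e12            # ~150 trillion cortical connections
energy_per_update = 12e-12   # 12 pJ per synapse per update (today)
update_rate_hz = 100         # assumed updates per second (hypothetical)

power_kw = synapses * energy_per_update * update_rate_hz / 1e3
print(f"~{power_kw:.0f} kW")  # ~180 kW, versus GW-scale von Neumann simulation
```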
Future Goals
• Provide a physical adaptive-learning hardware resource (the AHaH circuit) that supplies adaptive learning capacity to computing systems much as RAM supplies memory today.
• Use the AHaH node as a building block for higher-order adaptive algorithms not yet conceived.
Conclusion
• Artificial Neural Networks are excellent function approximators, but memory-processor communication on von Neumann architecture creates a major bottleneck
• This type of computation has been theorized and studied since the late 1940s
• AHaH computation exploits physical phenomena to implement an artificial neural network, lessening the memory-processor communication bottleneck for power while allowing massive parallelism.
Sources
• AHaH Computing – From Metastable Switches to Attractors to Machine Learning
– http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0085175
• "Neural Network Learning"
– http://web.eecs.utk.edu/~mclennan/Classes/420-527-S14/handouts/Part-2A1pp.pdf
• "Neural network models for pattern recognition and associative memory"
– http://www.sciencedirect.com/science/article/pii/089360808990035X
• "Perceptrons, Adalines, and Backpropagation"
– http://www-isl.stanford.edu/~widrow/papers/bc1995perceptronsadalines.pdf