DePauw University
Scholarly and Creative Work from DePauw University
Science Research Fellows Posters
Student Work
11-2014
A Parallel Genetic Algorithm For Tuning Neural
Networks
Nathan Chadderdon
Knox College
Ben Harsha
DePauw University
Steven Bogaerts
DePauw University
Follow this and additional works at: http://scholarship.depauw.edu/srfposters
Part of the Computer Sciences Commons
Recommended Citation
Chadderdon, Nathan, Ben Harsha, Steven Bogaerts. "A Parallel Genetic Algorithm For Tuning Neural Networks." Poster presented at
the 2014 Science Research Fellows Poster Session, Greencastle, IN, November 2014.
This Poster is brought to you for free and open access by the Student Work at Scholarly and Creative Work from DePauw University. It has been
accepted for inclusion in Science Research Fellows Posters by an authorized administrator of Scholarly and Creative Work from DePauw University.
For more information, please contact [email protected].
A Parallel Genetic Algorithm For Tuning Neural Networks

Nathan Chadderdon², Ben Harsha¹, Steven Bogaerts¹
1. DePauw University, Greencastle, IN ({benjaminharsha_2015, stevenbogaerts}@depauw.edu)
2. Knox College, Galesburg, IL ([email protected])
Abstract
One" challenge" in" using" ar.ficial" neural" networks" is" how" to" determine"
appropriate"parameters"for"network"structure"and"learning."OIen"parameters"
such" as" learning" rate" or" number" of" hidden" units" are" set" arbitrarily" or" with" a"
general"“intui.on""as"to"what"would"be"most"effec.ve."The"goal"of"this"project"
is" to" use" a" gene.c" algorithm" to" tune" a" popula.on" of" neural" networks" to"
determine" the" best" structure" and" parameters." This" paper" considers" a" gene.c"
algorithm"to"tune"the"number"of"hidden"units,"learning"rate,"momentum,"and"
number" of" examples" viewed" per" weight" update." Experiments" and" results" are"
discussed" for" two" domains" with" dis.nct" proper.es," demonstra.ng" the"
importance" of" careful" tuning" of" network" parameters" and" structure" for" best"
performance."
Gene.c" algorithms" use" genes" to" determine" the" traits" of" the" individuals"
neural"networks"in"the"popula.on."In"this"experiment"there"were"four"genes"
describing"the"different"network"setups:"
•  Hidden0Units"\"the"number"of"neurons"in"the"hidden"layer"of"the"neural"
network"(Fig."1)"
•  Batch0 Size" \" the" number" of" examples" to" be" considered" during" a" weight"
update"
•  Learning0Rate"\"the"speed"at"which"the"network"learns"
•  Momentum" \" the" percentage" of" the" previous" weight" update" used" in"
calcula.on"for"the"new"weight"update"
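The poster does not show its implementation, so the following C++ sketch is only an illustration of how the four genes could be held in one record; the field names and the momentum formula in the comment are assumptions, not the authors' code.

// Hypothetical genome record for the four tuned parameters; names, types,
// and the update formula in the comment are illustrative assumptions.
struct Genome {
    int    hidden_units;   // neurons in the hidden layer (Fig. 1)
    int    batch_size;     // examples considered per weight update
    double learning_rate;  // step size of each weight update
    double momentum;       // fraction of the previous weight update reused:
                           // new_delta = -learning_rate * gradient + momentum * old_delta
};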
Fig 2. A sample genome taken from the genetic algorithm (Hidden Units 67, Learning Rate 0.306, Momentum 0.320, Batch Size 20).
Setup

The first stage of this project was the creation of a multi-layer neural network, a common machine learning structure based on biological neurons and the connections between them. This network was designed to work with a variety of domains.

Fig 1. A multi-layer neural network.
Due"to"the"fact"that"
both"neural"networks"
and"gene.c"
algorithms"have"long"
run".mes,"parallel"
programming"was"
implemented"using"
OpenMP"to"increase"
the"speed"of"
computa.on.""
Two"sec.ons"were"parallelized,"the"first"being"the"training"calcula.ons"for"the"
network" and" the" second" being" the" crea.on" of" individuals" in" the" gene.c"
algorithm."Both"methods"produced"similar"speed"increases."
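The poster does not include its OpenMP code. The sketch below shows one plausible form of the second parallelized section (creating and evaluating the individuals of a generation); trainAndEvaluate() is an assumed stand-in for training a network from a genome and measuring its error. The training calculations themselves could be parallelized with a similar loop over examples or weights.

#include <vector>
// Compile with OpenMP enabled (e.g., -fopenmp); the pragma is ignored otherwise.

struct Genome { int hidden_units; int batch_size; double learning_rate; double momentum; };

// Assumed stand-in: builds a network from the genome, trains it, returns its error.
double trainAndEvaluate(const Genome& g);

void evaluatePopulation(const std::vector<Genome>& population, std::vector<double>& error)
{
    error.resize(population.size());

    // Each individual is independent of the others, so the iterations of this
    // loop can be distributed across threads.
    #pragma omp parallel for
    for (int i = 0; i < static_cast<int>(population.size()); ++i)
        error[i] = trainAndEvaluate(population[i]);
}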
Batch Size Trends in Cars Domain

Max Batch Size    Experiments Run    Below 50% of Max
50                54                 40
100               17                 17
150               8                  7

Fig 5. Number of experiments that settled below 50% of their maximum batch size.
Batch"size"had"a"tendency"to"seZle"on"values"that"were"low"rela.ve"to"their"
given" maximums." This" trend" held" true" for" both" domains," with" each" giving"
very"similar"values"for"batch"size."
Fig"2."A"sample"genome"taken"from"the"gene.c"algorithm"
Momentum0
Learning0Rate0
Momentum0vs0Average0Error0in0Cars0Domain0
0.08"
0.06"
0.04"
0.02"
0"
Next,"a"method"was"required"to"explore"which"network"setups"performed"
beZer." " In" this" case" a" gene.c" algorithm," which" mimics" the" process" of"
evolu.on,"was"used."Tests"were"run"by" "inpu_ng"certain"restraints"and"
allowing" the" gene.c" algorithm" to" " op.mize" the" " networks." The" best"
network"from"each"test"was"recorded"for"analysis."
0"
1"
2"
3"
Learning0Rate0
Fig"3."Effect"of"Learning"Rate"on"Average"Error"
4"
The" Cars" domain" preferred" to" use" higher" learning" rate." This" contradicts"
previous" assump.ons" that" small" learning" rates" around" 0.1" or" 0.3" were"
op.mal.""The"Splice"domain"behaved"as"expected,"and"selected"low"learning"
rates"in"the"0.1\0.3"range."
Hidden Units Trends in Cars Domain

Max Hidden Units    Experiments Run    Below 50% of Max
30                  46                 34
50                  8                  7
100                 17                 13
150                 8                  7

Fig 4. Number of experiments that settled below 50% of their maximum number of hidden units.
The"number"of"hidden"units"preferred"to"seZle"on"low"values"rela.ve"to"the"
given" maximums." This" trend" con.nued" in" the" Splice" domain," although" the"
range"of"values"chosen"in"that"domain"was"slightly"larger."
0.1"
0.09"
0.08"
0.07"
0.06"
0.05"
0.04"
0.03"
0.02"
0.01"
0"
0"
0.2"
0.4"
0.6"
Momentum0
0.8"
1"
Fig"6."Effect"of"Momentum"on"Average"Error"
Momentum"did"not"affect"results"for"either"domain."The"values"chosen"were"
seemingly" random" and" there" was" no" trend" that" indicated" a" posi.ve" or"
nega.ve"effect"on"the"average"error."
Conclusions

• Learning Rate - Preference for learning rate depended strongly on the domain
• Hidden Units - Tended to settle on very low values for the Cars domain, slightly higher values for the Splice domain
• Batch Size - Tended to settle on low values for both domains
• Momentum - No effect on the results
Acknowledgement
This project was funded by the National Science Foundation.
Grant number: CNS-1156893