Artificial Neural Networks
for Pattern Recognition
Jack Breese
Computer Systems
Quarter 3, Pd. 7

What is a Neural Network?
Interconnected neurons
Weights
Output

Uses of Neural Networks
Pattern Recognition
  Face Recognition
  OCR

Neurons
Add up each weighted input
Use an activation function to determine the output
Pass the output on to the next layer

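A minimal sketch of that computation in C (the logistic sigmoid used here as the activation function is an assumption; the presentation does not say which activation the library uses):

#include <math.h>

/* One neuron's output from its weighted inputs. The sigmoid is an
   assumed activation function, not necessarily the library's choice. */
float neuronOutput(const float* inputs, const float* weights, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; i++) {
        sum += inputs[i] * weights[i];    /* add up each weighted input */
    }
    return 1.0f / (1.0f + expf(-sum));    /* squash the sum into an output */
}
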
Training Neural Networks
Large input set
Outputs are verified and weights are adjusted along a gradient based on these results.

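A minimal sketch of that kind of gradient-based adjustment for a single weight (the delta-rule form and the learning rate are assumptions; the presentation does not spell out the exact update used):

/* Nudge one connection weight along the gradient that reduces the error
   between the verified (expected) output and the actual output.
   The delta-rule form and learningRate are assumptions. */
float updateWeight(float weight, float input, float error, float learningRate) {
    return weight + learningRate * error * input;
}
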
Program Information
Neural Network Library written in C
Currently capable of initializing a two-layer perceptron with working, but unweighted, connections
Can load images up to 500x500 pixels in size

Data Structure
#include <stdlib.h>

typedef struct _connection {
    float weight;
    struct _neuron * from;    /* the neuron this connection comes from */
} connection;

/* TODO: Implement a neuron which supports connections. */
typedef struct _neuron {
    float d;
    connection * cons;        /* array of incoming connections */
} neuron;

/* Allocate a neuron with room for c incoming connections. */
neuron* mkneuron(int c) {
    neuron* n = malloc(sizeof(neuron));
    n->d = 0;
    n->cons = malloc(c * sizeof(connection));
    return n;
}

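As a usage sketch only (this helper and the layer layout are assumptions, though they match how the saveWeights routine later in this presentation indexes hidden[j].cons[i]), a contiguous layer could be built like this:

/* Hypothetical helper, not from the library: build a contiguous layer of
   `count` neurons, each with `inputs` incoming (still unweighted)
   connections, mirroring what mkneuron does for a single neuron. */
neuron* mklayer(int count, int inputs) {
    neuron* layer = malloc(count * sizeof(neuron));
    for (int j = 0; j < count; j++) {
        layer[j].d = 0;
        layer[j].cons = malloc(inputs * sizeof(connection));
    }
    return layer;
}
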
New Progress
Saving and Loading of Weights
New Data Structures
Optimization
Testing

Saving and Loading Weights
int saveWeights(char* filename, neuron* hidden, neuron* outputs,
                int insize, int hiddensize, int outsize) {
    FILE* output = fopen(filename, "wb");
    if (output == NULL) {
        fprintf(stderr, "Error: Unable to open output file for writing.\n");
        exit(1);
    }
    /* Header: the size of each layer */
    fprintf(output, "%d\n", insize);
    fprintf(output, "%d\n", hiddensize);
    fprintf(output, "%d\n", outsize);

    /* Hidden layer: one weight per line, in connection order */
    fprintf(output, "%s\n", "[Hidden Layer Weights]");
    for (int j = 0; j < hiddensize; j++) {
        for (int i = 0; i < insize; i++) {
            fprintf(output, "%f\n", hidden[j].cons[i].weight);
        }
    }

    /* Output layer: one weight per line, in connection order */
    fprintf(output, "%s\n", "[Output Layer Weights]");
    for (int i = 0; i < outsize; i++) {
        for (int j = 0; j < hiddensize; j++) {
            fprintf(output, "%f\n", outputs[i].cons[j].weight);
        }
    }

    fprintf(output, "%s\n", "[End]");
    fclose(output);
    return 0;
}

Saving and Loading, Cont.
File Format
  Stores the size of each layer
  Stores the weight of each connection in order
Loading
  Just like saving, but backwards.

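As a minimal sketch of "backwards" (the function name, signature, and error handling here are assumptions; only saveWeights appears in this presentation), a loader for the format above might look like this:

#include <stdio.h>

/* Hypothetical loader matching the format written by saveWeights above;
   loadWeights itself is not shown in this presentation and is an assumption. */
int loadWeights(char* filename, neuron* hidden, neuron* outputs,
                int insize, int hiddensize, int outsize) {
    FILE* input = fopen(filename, "rb");
    if (input == NULL) {
        fprintf(stderr, "Error: Unable to open input file for reading.\n");
        return 1;
    }
    int in, hid, out;
    char label[64];
    /* Header: layer sizes, checked against the caller's network */
    if (fscanf(input, "%d %d %d", &in, &hid, &out) != 3 ||
        in != insize || hid != hiddensize || out != outsize) {
        fclose(input);
        return 1;
    }
    fscanf(input, " %63[^\n]", label);   /* "[Hidden Layer Weights]" */
    for (int j = 0; j < hiddensize; j++)
        for (int i = 0; i < insize; i++)
            fscanf(input, "%f", &hidden[j].cons[i].weight);
    fscanf(input, " %63[^\n]", label);   /* "[Output Layer Weights]" */
    for (int i = 0; i < outsize; i++)
        for (int j = 0; j < hiddensize; j++)
            fscanf(input, "%f", &outputs[i].cons[j].weight);
    fclose(input);
    return 0;
}
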
New Data Structures
A netsize struct
  Contains three ints
  Makes keeping track of the network size cleaner
typedef struct _netsize {
    int insize;
    int hiddensize;
    int outsize;
} netsize;

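For illustration only (the saveWeights shown earlier still takes the three sizes separately; this wrapper is an assumption, not the library's API), a netsize value tidies up a call like this:

/* Hypothetical wrapper showing how netsize can replace three size
   parameters; not part of the library as presented. */
int saveWeightsN(char* filename, neuron* hidden, neuron* outputs, netsize sz) {
    return saveWeights(filename, hidden, outputs,
                       sz.insize, sz.hiddensize, sz.outsize);
}
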
Testing
Memory usage was tested
  See next slide.

Problems Encountered
Initially thought memory usage was low
  Forgot to reset the counter in nested for loops to 0
  That was dumb.
Corrected the problem; memory usage went up
  Decided to scale back network size/interconnectedness

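A hypothetical illustration of that counter mistake (not the project's actual code): if the inner loop's counter is never reset, every pass after the first does nothing, so any total it accumulates comes out far too low.

/* Hypothetical illustration of the bug described above, not the
   project's actual code. */
int countConnections(int hiddensize, int insize) {
    int count = 0;
    int i = 0;
    for (int j = 0; j < hiddensize; j++) {
        /* BUG: i should be reset to 0 here. Without the reset, the inner
           loop only runs on the first pass, so count ends up as insize
           instead of hiddensize * insize. */
        for (; i < insize; i++) {
            count++;
        }
    }
    return count;
}
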
4th Quarter Goals
Perform simpler recognition tasks on smaller images due to memory/processor constraints
  Simple face recognition
  Optical character recognition
  Shape recognition
  Grocery store produce recognition