Monday, October 13, 2008
Activity 20: Neural Networks
Pattern recognition is, simply put, making the computer recognize patterns that we recognize visually. This raises a natural question: why not model pattern recognition on the way our brain works? That is exactly what neural networks do. A neural network is composed of neurons arranged in layers, with each layer fully connected to the preceding one. The layers are called the input, hidden, and output layers. Input features are fed to the input layer, passed on to the hidden layer, and finally to the output layer. Each neuron fires in an "all-or-nothing" fashion that depends on the strength of its connections to adjacent neurons. The tansig (hyperbolic tangent sigmoid) function is one of the most commonly used activation functions in pattern recognition problems.
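To make this concrete, here is a minimal sketch of what a single neuron computes; the weights, bias, and inputs are made-up values, and the last line is just the formula that the toolbox's tansig implements:

x = [0.5; 0.2; 0.8];         % input features (arbitrary example values)
w = [0.4; -0.7; 0.1];        % connection weights (arbitrary example values)
b = 0.3;                     % bias
n = w'*x + b;                % net input: weighted sum of inputs plus bias
y = 2/(1 + exp(-2*n)) - 1;   % tansig(n): squashes the net input into (-1, 1)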
For neural networks I used MATLAB instead of Scilab, since MATLAB has a built-in Neural Network Toolbox while Scilab requires the separate ANN toolbox. For features I reused the parameters from Activity 18 for the same 4 classes of images: Area, Height over Width, Average R, and Average G.
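As a rough illustration only (the filename, threshold, and binarization polarity are my assumptions here, not the actual Activity 18 code), these four features could be extracted from an image of a single object like this:

img = imread('sample.png');                          % RGB image of one object
bw  = im2bw(rgb2gray(img), 0.5);                     % crude binarization of the object
area = sum(bw(:));                                   % Area: number of object pixels
[r, c] = find(bw);
hw = (max(r) - min(r) + 1)/(max(c) - min(c) + 1);    % Height over Width of the bounding box
R = double(img(:,:,1)); G = double(img(:,:,2)); B = double(img(:,:,3));
I = R + G + B + eps;                                 % total intensity; eps avoids division by zero
avgR = mean(R(bw)./I(bw));                           % Average normalized R over the object pixels
avgG = mean(G(bw)./I(bw));                           % Average normalized G over the object pixels
features = [area; hw; avgR; avgG];

Below is the code I used for training and testing: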
% Each row below is one sample: [Area, Height/Width, Average R, Average G].
% There are 4 classes with 3 samples each; the transpose puts one sample
% per column, which is the orientation newff/train/sim expect.
training = [5345 1.000000 0.758862 0.129754
            6128 0.958763 0.756430 0.136960
            5541 1.011494 0.814644 0.108821
            1239 1.076923 0.602028 0.283912
            1224 1.078947 0.629195 0.280192
            1206 1.108108 0.611142 0.289349
             439 0.444444 0.253336 0.508687
             368 0.304348 0.248109 0.521104
             372 0.444444 0.255933 0.518223
            2080 1.000000 0.520683 0.374020
            2064 1.040000 0.529217 0.382716
            1977 1.083333 0.582884 0.345742]';

test = [5421 1.000000 0.795391 0.112691
        5162 0.932584 0.826950 0.100648
        5281 1.098765 0.788015 0.122597
        1264 1.051282 0.633988 0.271862
        1179 1.105263 0.618832 0.278147
        1191 1.105263 0.615691 0.280110
         306 0.409091 0.243423 0.517028
         297 0.304348 0.245041 0.518146
         304 0.239130 0.252019 0.514459
        2000 0.980000 0.499452 0.372944
        1956 1.020408 0.525111 0.379350
        1906 1.062500 0.522208 0.392979]';
% Desired outputs: a 2-bit binary code per class, one column per sample.
% With 4 classes of 3 samples each, the codes are class 1 = [0;0],
% class 2 = [0;1], class 3 = [1;0], class 4 = [1;1].
training_out = [0 0; 0 0; 0 0; 0 1; 0 1; 0 1; 1 0; 1 0; 1 0; 1 1; 1 1; 1 1]';
% Feedforward net: 25 hidden neurons, 2 output neurons, tansig activations
% in both layers, trained with gradient descent (traingd).
net = newff(minmax(training), [25, 2], {'tansig', 'tansig'}, 'traingd');
net.trainParam.show   = 50;      % show training progress every 50 epochs
net.trainParam.lr     = 0.01;    % learning rate
net.trainParam.lr_inc = 1.05;    % learning-rate increase factor
net.trainParam.epochs = 1000;    % maximum number of epochs
net.trainParam.goal   = 1e-2;    % mean-squared-error goal

[net, tr] = train(net, training, training_out);

a = sim(net, training);          % network output on the training set
a = 1*(a >= 0.5);                % threshold to binary class codes
b = sim(net, test);              % network output on the test set
b = 1*(b >= 0.5);

% The test set follows the same class ordering as the training set, so
% training_out doubles as the test targets. A nonzero entry in diff
% marks a misclassified test sample.
diff = abs(training_out - b);
diff = diff(1,:) + diff(2,:);
diff = 1*(diff >= 1);
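Since the two output neurons together form a 2-bit code, the thresholded outputs can also be decoded back into class labels 1 to 4. A small sketch of this (the variable name labels is mine):

% Treat row 1 as the most significant bit of the 2-bit class code.
labels = 2*b(1,:) + b(2,:) + 1;
% Perfect classification gives labels = [1 1 1 2 2 2 3 3 3 4 4 4].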
Again, 100% classification was observed: diff came out all zeros, meaning every one of the 12 test samples was assigned to its correct class.
Acknowledgments
Thanks to Jeric for helping me with the code. I understood this activity! I give myself 10/10 neutrinos!
posted by poy @ 2:43 AM
1 Comment:

I'm really getting the hang of this, Paul! Haha