Backpropagation

Example 1: The XOR Problem

This is an implementation of backpropagation to solve the classic XOR problem. Note the additional input node for bias.
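A minimal sketch of such a network, consistent with the results below: it assumes a 3-3-1 layout (two inputs plus an always-on bias input node, three hidden nodes, one output), tanh activation, bipolar (+1/-1) encoding, and a guessed learning rate of 0.2; none of these are stated outright in the article. The twelve weights printed under the results do match that layout: nine input-to-hidden weights plus three hidden-to-output weights.

import java.util.Random;

public class XorBackprop {
    // Assumed 3-3-1 layout: two inputs plus one constant bias input node,
    // three hidden nodes, one output node.  This gives 3*3 + 3 = 12
    // weights, matching the twelve "Weight = ..." lines printed below.
    static final int INPUTS = 3;
    static final int HIDDEN = 3;
    static final double LEARN_RATE = 0.2; // assumed; not given in the article
    static final int EPOCHS = 200;

    public static void main(String[] args) {
        Random rand = new Random();
        double[][] wIH = new double[INPUTS][HIDDEN]; // input-to-hidden weights
        double[] wHO = new double[HIDDEN];           // hidden-to-output weights

        // Small random initial weights, like those listed below.
        for (int i = 0; i < INPUTS; i++) {
            for (int j = 0; j < HIDDEN; j++) {
                wIH[i][j] = (rand.nextDouble() - 0.5) * 0.16;
            }
        }
        for (int j = 0; j < HIDDEN; j++) {
            wHO[j] = (rand.nextDouble() - 0.5) * 0.16;
        }

        // Bipolar XOR patterns; the third input is the constant bias.
        double[][] patterns = {{1, 1, 1}, {1, -1, 1}, {-1, 1, 1}, {-1, -1, 1}};
        double[] targets = {-1, 1, 1, -1};

        for (int epoch = 0; epoch <= EPOCHS; epoch++) {
            double sumSquaredError = 0.0;
            for (int p = 0; p < patterns.length; p++) {
                double[] x = patterns[p];

                // Forward pass: tanh keeps activations in (-1, 1).
                double[] h = new double[HIDDEN];
                for (int j = 0; j < HIDDEN; j++) {
                    double net = 0.0;
                    for (int i = 0; i < INPUTS; i++) { net += x[i] * wIH[i][j]; }
                    h[j] = Math.tanh(net);
                }
                double netOut = 0.0;
                for (int j = 0; j < HIDDEN; j++) { netOut += h[j] * wHO[j]; }
                double out = Math.tanh(netOut);

                // Backward pass: deltas use the tanh derivative, 1 - y^2.
                double error = targets[p] - out;
                sumSquaredError += error * error;
                double deltaOut = error * (1.0 - out * out);
                for (int j = 0; j < HIDDEN; j++) {
                    // Hidden delta uses the pre-update output weight.
                    double deltaHid = deltaOut * wHO[j] * (1.0 - h[j] * h[j]);
                    wHO[j] += LEARN_RATE * deltaOut * h[j];
                    for (int i = 0; i < INPUTS; i++) {
                        wIH[i][j] += LEARN_RATE * deltaHid * x[i];
                    }
                }
            }
            System.out.println("epoch = " + epoch + " RMS Error = "
                    + Math.sqrt(sumSquaredError / patterns.length));
        }
    }
}

With only four training patterns, plain online (per-pattern) updates like these converge quickly; the run shown below reaches an RMS error around 4e-5 by epoch 200.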

 

Example Results

I print out these long lists of numbers so I can watch the error levels and weight values progress. What I want to see is the weight values changing to "imitate" the input values, and the error levels steadily decreasing.

The end result shows the algorithm making approximate matches between the actual (target) values and the network's outputs (the neural "model"). The outputs approach but never exactly reach the ±1 targets, which is what you would expect from a squashing activation like tanh.
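Presumably the RMS error reported for each epoch is the root mean square of the output errors taken over the four training patterns, i.e.

\[ \text{RMS} = \sqrt{\frac{1}{P} \sum_{p=1}^{P} (t_p - y_p)^2}, \qquad P = 4, \]

where t_p is the target for pattern p and y_p is the network's output. With ±1 targets and near-zero initial outputs, this starts near 1.0, matching the first epochs listed.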

 

Weight = 0.0332778215408325
Weight = 0.0673746705055237
Weight = 0.0097496509552002
Weight = 0.0339824438095093
Weight = -0.00304974317550659
Weight = 0.075954532623291
Weight = -0.0314040899276733
Weight = 0.0599555611610413
Weight = -0.0635984897613525
Weight = -0.0470492601394653
Weight = -0.0786069750785828
Weight = -0.0414167404174805

epoch = 0 RMS Error = 1.00828546088439
epoch = 1 RMS Error = 1.04431756324516
epoch = 2 RMS Error = 1.06694331123151
epoch = 3 RMS Error = 0.99282299209343
epoch = 4 RMS Error = 0.985786944458554
epoch = 5 RMS Error = 0.989251619888048
epoch = 6 RMS Error = 1.05938566542057
epoch = 7 RMS Error = 1.02850842638726
epoch = 8 RMS Error = 1.07480413385433
epoch = 9 RMS Error = 0.958273014469799

Abbreviated results...

epoch = 194 RMS Error = 5.9204799097716E-05
epoch = 195 RMS Error = 6.25204206876643E-05
epoch = 196 RMS Error = 5.18000995447541E-05
epoch = 197 RMS Error = 5.16330207454197E-05
epoch = 198 RMS Error = 4.43953577070971E-05
epoch = 199 RMS Error = 3.85723159865368E-05
epoch = 200 RMS Error = 3.74542897223701E-05

pat = 1 actual = 1 neural model = 0.999949817462897
pat = 2 actual = 1 neural model = 0.99998176898266
pat = 3 actual = -1 neural model = -0.999998139106047
pat = 4 actual = -1 neural model = -0.999947491188123

 
