Back Propagation Algorithm
To start off, the parameters of the neural network are set to arbitrary values. We have
a set of sample pairs, each consisting of an input vector and its desired output. These
examples are fed to the network one after the other, and in each case the network
adjusts its parameters as explained below. After all pairs have been presented once,
they are presented again in a shuffled order, then again in yet another order, and so
on until the performance is found satisfactory (this will be elaborated on later).
Below, we explain what is done for each sample presented to the network.
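The epoch loop described above can be sketched as follows. This is a minimal outline, not the full algorithm: update_params (the per-sample weight adjustment) and satisfactory (the stopping test) are hypothetical placeholders standing in for the steps elaborated later in the text.

```python
import random

def train(samples, update_params, satisfactory, max_epochs=1000):
    """Present every (input, desired-output) pair once per epoch,
    reshuffling the presentation order each time, until the
    performance criterion is met."""
    for epoch in range(max_epochs):
        random.shuffle(samples)      # a new order for every pass
        for x, d in samples:
            update_params(x, d)      # adjust weights for this pair
        if satisfactory():           # stopping test (elaborated later)
            break
```

Shuffling between passes prevents the network from adapting to one fixed presentation order.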

Say some input sample is fed to the network. Let the output of the k'th neuron in the
output layer be y_k, and let the desired response of this k'th neuron be d_k. We define
the error e_k to be:

e_k = d_k - y_k
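As a small numerical illustration of this definition, the per-neuron error is just the element-wise difference between the desired and actual outputs. The vectors y and d below are hypothetical example values, not taken from the text.

```python
# Actual outputs y_k of the output layer for one sample (hypothetical values)
y = [0.8, 0.2, 0.5]
# Desired responses d_k for the same sample (hypothetical values)
d = [1.0, 0.0, 0.5]

# Error at each output neuron: e_k = d_k - y_k
e = [dk - yk for dk, yk in zip(d, y)]
```

A neuron whose output already matches its desired response contributes zero error.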
