The counterpropagation method was developed by Hecht-Nielsen [HN87] in 1987. Its main characteristic is a considerably reduced learning time. It is a combination of a Kohonen layer and a Grossberg layer.
The combination of these two methods has properties that neither has separately.
Figure 34: Counterpropagation network
The Kohonen layer operates as described in the previous pages.
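As an illustration only, the following sketch shows a winner-take-all Kohonen stage, assuming the common choice of selecting the neuron whose weight vector is closest to the input (the earlier pages may use a different distance or a normalised dot product; the function name and shapes are hypothetical).

```python
import numpy as np

def kohonen_winner(K, x):
    """K: (n_kohonen, n_inputs) Kohonen weight matrix; x: input vector.

    Returns the index of the single active neuron (k_i = 1, all others 0),
    here chosen as the neuron whose weight vector is closest to the input.
    """
    distances = np.linalg.norm(K - x, axis=1)   # distance to each prototype
    return int(np.argmin(distances))            # index of the closest neuron
```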
From an input vector, the Grossberg layer provides at each output the value of the weight connecting that output neuron to the single active neuron of the Kohonen layer. During learning, these weights are updated as follows:

w_{ij}(t+1) = w_{ij}(t) + \beta \, k_i \, (y_j - w_{ij}(t))

k_i : output of neuron i of the Kohonen layer (the only active neuron).
y_j : component j of the desired output vector.

Initially \beta is set to 0.1; this value is then reduced during learning.
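A minimal sketch of this update rule in Python, assuming a winner-take-all Kohonen layer in which exactly one neuron is active for each input; the function and variable names (grossberg_update, winner, etc.) are illustrative, not taken from the original.

```python
import numpy as np

def grossberg_update(W, winner, target, beta):
    """Move the weights of the winning Kohonen neuron toward the desired output.

    W      : array of shape (n_kohonen, n_outputs), weights w_ij
    winner : index i of the single active Kohonen neuron (k_i = 1)
    target : desired output vector y (length n_outputs)
    beta   : learning rate, e.g. 0.1 at the start, reduced over time
    """
    # Only the row of the winning neuron changes; all other rows keep
    # their values because k_i = 0 for them.
    W[winner] += beta * (target - W[winner])
    return W

def grossberg_output(W, winner):
    """The network output is simply the weight row of the active neuron."""
    return W[winner].copy()

# Minimal usage example with hypothetical sizes
rng = np.random.default_rng(0)
W = rng.uniform(0.0, 1.0, size=(4, 3))    # 4 Kohonen neurons, 3 outputs
beta = 0.1
for step in range(100):
    winner = 2                             # index returned by the Kohonen layer
    y = np.array([0.0, 1.0, 0.5])          # desired output for this input
    W = grossberg_update(W, winner, y, beta)
    beta *= 0.99                           # gradually reduce the learning rate
```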
This method can be modified so that the result combines several neurons of the Kohonen layer: the group of neurons obtaining the best scores is retained, and the output is then interpolated among them (see the sketch below).
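A possible sketch of this interpolated variant, under the assumption (not stated in the source) that each retained neuron contributes in proportion to its Kohonen score:

```python
import numpy as np

def interpolated_output(W, scores, k=3):
    """W: (n_kohonen, n_outputs) Grossberg weights; scores: Kohonen activations (higher = better).

    The k best-scoring Kohonen neurons all contribute, and the output is a
    weighted average of their Grossberg weight rows.  The normalisation used
    here (score divided by the sum of the retained scores) is only one
    possible choice; the source does not specify the exact interpolation rule.
    """
    best = np.argsort(scores)[-k:]                # indices of the k best neurons
    coeffs = scores[best] / scores[best].sum()    # normalised contributions
    return coeffs @ W[best]                       # interpolated output vector
```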
The author of counterpropagation points out that in most cases this method is less efficient than backpropagation. However, its simplicity and low computation time can make it attractive for some applications.