On a perceptron-type learning rule for higher order Hopfield neural networks including dilation and translation

Fast facts

Quote

B. Lenze, “On a perceptron-type learning rule for higher order Hopfield neural networks including dilation and translation,” Neurocomputing, vol. 48, no. 1–4, pp. 391–401, 2002.

Content

In the following, we introduce a new perceptron-like learning rule that enhances the recall performance of higher order Hopfield neural networks without significantly increasing their complexity. In detail, our approach leads to a generalized perceptron learning rule which generates higher order Hopfield neural networks with dilation and translation that perform perfectly on the training set, provided the latter fulfills the so-called conditionally strong Γ-separability condition. In this sense, our learning scheme satisfies a kind of optimality criterion: it finds appropriate network parameters in a finite number of learning cycles whenever a solution exists.
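The abstract does not reproduce the learning rule itself, but the general idea of a perceptron-type update applied to a higher order (here: quadratic) unit with a dilation factor and a translation term can be sketched as follows. This is a hypothetical simplification, not Lenze's actual rule: the function name, the choice of keeping the dilation fixed, and the specific update steps are assumptions for illustration only.

```python
import itertools
import numpy as np

def train_higher_order_perceptron(patterns, targets, eta=1.0, max_epochs=100):
    """Perceptron-style training of a single higher order (quadratic) unit
    with a dilation factor and a translation (threshold) term.

    Hypothetical sketch: Lenze's generalized rule also adapts the dilation;
    here it is kept fixed at 1.0 for simplicity.
    """
    n = patterns.shape[1]
    pairs = list(itertools.combinations(range(n), 2))
    w1 = np.zeros(n)           # first-order weights
    w2 = np.zeros(len(pairs))  # second-order (quadratic) weights
    delta, tau = 1.0, 0.0      # dilation and translation

    def quad_features(x):
        # all pairwise products x_j * x_k, j < k
        return np.array([x[j] * x[k] for j, k in pairs])

    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(patterns, targets):
            y = 1 if delta * (w1 @ x + w2 @ quad_features(x)) - tau >= 0 else -1
            if y != t:
                errors += 1
                w1 += eta * t * x                 # classical perceptron step ...
                w2 += eta * t * quad_features(x)  # ... extended to quadratic terms
                tau -= eta * t                    # translation update
        if errors == 0:  # perfect recall on the training set
            break
    return w1, w2, delta, tau
```

Because the update is an ordinary perceptron step in the augmented (quadratic) feature space, the classical perceptron convergence argument carries over: if the training set is separable in that space, the loop terminates after finitely many cycles, which mirrors the finite-cycle guarantee stated in the abstract. For example, bipolar XOR-type targets `t = x_0 * x_1`, which are not linearly separable, are learned here because the product `x_0 * x_1` appears directly as a second-order feature.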
