
Improving Leung's bidirectional learning rule for associative memories

Quick facts

  • Internal authorship
  • Publication: 2001
  • Journal/newspaper: IEEE Transactions on Neural Networks (5)
  • Organizational unit
  • Subject areas: Applied Computer Science
  • Format: Journal article (article)

Citation

B. Lenze, “Improving Leung’s bidirectional learning rule for associative memories,” IEEE Transactions on Neural Networks, vol. 12, no. 5, pp. 1222–1226, 2001.

Abstract

Leung (1994) introduced a perceptron-like learning rule to enhance the recall performance of bidirectional associative memories (BAMs). He proved that his so-called bidirectional learning scheme always yields a solution within a finite number of learning iterations, provided a solution exists. Unfortunately, in Leung's setting a solution exists only if the training set is strongly linearly separable by hyperplanes through the origin. We extend Leung's approach by considering conditionally strongly linearly separable sets, allowing separating hyperplanes that do not contain the origin. Moreover, we deal with BAMs that are generalized by so-called dilation and translation parameters, which enlarge their capacity while leaving their complexity almost unaffected. The whole approach leads to a generalized bidirectional learning rule that generates BAMs with dilation and translation that perform perfectly on the training set whenever the latter satisfies the conditional strong linear separability assumption. In the sense of Leung, we therefore arrive at an optimal learning strategy that contains Leung's initial idea as a special case.
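To make the perceptron-like, bidirectional training idea concrete, the following is a minimal Python sketch of such a scheme for a bipolar BAM extended with translation (threshold) terms, which corresponds to allowing separating hyperplanes that do not pass through the origin. The function name, learning rate, update factors, and stopping criterion are illustrative assumptions; the dilation parameters and the exact rule whose convergence is proved in the paper are omitted.

```python
import numpy as np

def bipolar_sign(v):
    # map to {-1, +1}; ties broken toward +1
    return np.where(v >= 0, 1, -1)

def train_bam_bidirectional(X, Y, epochs=100, lr=1.0):
    """Illustrative perceptron-like bidirectional learning loop (not the
    paper's exact rule). X: (p, n) bipolar patterns, Y: (p, m) associations."""
    p, n = X.shape
    _, m = Y.shape
    W = np.zeros((m, n))   # weight matrix shared by both recall directions
    t_y = np.zeros(m)      # translation (threshold) terms, forward pass
    t_x = np.zeros(n)      # translation (threshold) terms, backward pass

    for _ in range(epochs):
        stable = True
        for x, y in zip(X, Y):
            y_hat = bipolar_sign(W @ x - t_y)    # forward recall x -> y
            x_hat = bipolar_sign(W.T @ y - t_x)  # backward recall y -> x

            # perceptron-like corrections on misrecalled pairs
            if not np.array_equal(y_hat, y):
                W += lr * np.outer(y - y_hat, x)
                t_y -= lr * (y - y_hat)
                stable = False
            if not np.array_equal(x_hat, x):
                W += lr * np.outer(y, x - x_hat)
                t_x -= lr * (x - x_hat)
                stable = False
        if stable:   # every training pair recalled perfectly in both directions
            break
    return W, t_y, t_x

if __name__ == "__main__":
    X = np.array([[ 1, -1,  1, -1],
                  [-1,  1,  1,  1]])
    Y = np.array([[ 1, -1],
                  [-1,  1]])
    W, t_y, t_x = train_bam_bidirectional(X, Y)
    print(bipolar_sign(W @ X[0] - t_y))  # should recall Y[0]
```

In this sketch the thresholds t_y and t_x play the role of the translation parameters described in the abstract: without them the decision boundaries are forced through the origin, which is exactly the restriction of Leung's original setting.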
