Neural Network Design for Efficient Information Retrieval

  • L. Personnaz
  • I. Guyon
  • G. Dreyfus
Conference paper
Part of the NATO ASI Series (volume 20)

Abstract

The ability of neural networks to store and retrieve information has been investigated for many years. A renewed interest has been triggered by the analogy between neural networks and spin glasses pointed out by W.A. Little et al. [1] and J.J. Hopfield [2]. Such systems would be potentially useful autoassociative memories “if any prescribed set of states could be made the stable states of the system” [2]; however, the storage prescription (derived from Hebb’s law) used by both authors did not meet this requirement, so that the information retrieval properties of neural networks based on this law were not fully satisfactory. In the present paper, a generalization of Hebb’s law is derived so as to guarantee, under fairly general conditions, the retrieval of the stored information (autoassociative memory). Illustrative examples are presented.
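
The storage prescription itself is not reproduced in this abstract. As a purely illustrative sketch (not the authors' derivation), the short NumPy program below shows the kind of pseudoinverse-based "projection" rule discussed in reference [3]: building the weight matrix from the prototype states and their Moore-Penrose pseudoinverse [4, 5] makes every prototype an exact fixed point of the retrieval dynamics. All names and parameter values (S, W, retrieve, 64 neurons, 8 prototypes) are assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_patterns = 64, 8

# Prototype states to be stored: columns of S, with components in {-1, +1}.
S = rng.choice([-1.0, 1.0], size=(n_neurons, n_patterns))

# Pseudoinverse ("projection") storage prescription: W projects onto the
# subspace spanned by the prototypes, so W s_k = s_k for every stored s_k.
W = S @ np.linalg.pinv(S)

def retrieve(state, n_steps=20):
    # Synchronous retrieval dynamics: s(t+1) = sign(W s(t)).
    for _ in range(n_steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1.0       # break ties deterministically
        if np.array_equal(new_state, state):  # reached a fixed point
            break
        state = new_state
    return state

# Every prototype is a stable state of the dynamics.
assert all(np.array_equal(retrieve(S[:, k]), S[:, k]) for k in range(n_patterns))

# A prototype corrupted by a few flipped components is typically restored.
probe = S[:, 0].copy()
flipped = rng.choice(n_neurons, size=5, replace=False)
probe[flipped] *= -1
print("recovered prototype 0:", np.array_equal(retrieve(probe), S[:, 0]))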

References

  1. W.A. Little, G.L. Shaw, Math. Biosciences 39, 281 (1978).
  2. J.J. Hopfield, Proc. Natl. Acad. Sci. USA 79, 2554 (1982).
  3. L. Personnaz, I. Guyon, G. Dreyfus, J. Physique Lett. 46, L-359 (1985).
  4. A. Albert, Regression and the Moore-Penrose Pseudoinverse (Academic Press, 1972).
  5. T.N.E. Greville, SIAM Rev. 2, 5 (1960).
  6. To be published.

Copyright information

© Springer-Verlag Berlin Heidelberg 1986

Authors and Affiliations

  • L. Personnaz (1)
  • I. Guyon (1)
  • G. Dreyfus (1)

  1. Laboratoire d’Electronique, E.S.P.C.I., Paris, France
