• Rapid Communication

Neural networks with nonlinear synapses and a static noise

H. Sompolinsky
Phys. Rev. A 34, 2571(R) – Published 1 September 1986

Abstract

The theory of neural networks is extended to include a static noise as well as nonlinear updating of synapses by learning. The noise appears either in the form of spin-glass interactions, which are independent of the learning process, or as a random decaying of synapses. In an unsaturated network, the nonlinear learning algorithms may modify the energy surface and lead to interesting new computational capabilities. Close to saturation, they act as an additional source of a static noise. The effect of the noise on memory storage is calculated.
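To make the first noise form concrete, here is a minimal illustrative sketch (not the paper's analytical formulation): a Hopfield-type network with Hebbian couplings to which an independent, symmetric Gaussian "spin-glass" term is added, after which retrieval of a stored pattern is checked by zero-temperature dynamics. The network size `N`, pattern number `P`, and noise scale `noise_std` are all assumed for illustration.

```python
import numpy as np

# Illustrative sketch, assuming a standard Hopfield model; the static noise
# appears as symmetric Gaussian couplings independent of the learning process,
# one of the two noise forms the abstract describes.

rng = np.random.default_rng(0)

N = 200          # number of spins/neurons (assumed)
P = 10           # stored patterns, well below saturation (assumed)
noise_std = 0.1  # overall scale of the static synaptic noise (assumed)

# Random binary memory patterns xi^mu in {-1, +1}
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (linear-learning) couplings: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu
J = patterns.T @ patterns / N

# Static spin-glass noise: symmetric Gaussian couplings added to J,
# independent of the stored patterns
noise = rng.normal(0.0, noise_std / np.sqrt(N), size=(N, N))
J += (noise + noise.T) / 2
np.fill_diagonal(J, 0.0)

def retrieve(s, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s
            s[i] = 1 if h >= 0 else -1
    return s

# Probe with a 10%-corrupted version of pattern 0 and measure the final overlap
probe = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
final = retrieve(probe)
overlap = final @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

For small noise and a loading ratio P/N far below saturation, the stored pattern remains a stable attractor and the final overlap stays close to 1; increasing `noise_std` degrades retrieval, which is the regime the paper's calculation quantifies.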

  • Received 11 June 1986

DOI: https://doi.org/10.1103/PhysRevA.34.2571

©1986 American Physical Society

Authors & Affiliations

H. Sompolinsky

  • Department of Physics, Bar-Ilan University, Ramat-Gan, Israel 52100

Issue

Vol. 34, Iss. 3 — September 1986

