
REGULAR PAPER

Symbolic processing in neural networks

João Pedro Neto I; Hava T. Siegelmann II; J. Félix Costa III

I Faculdade de Ciências, Dept. Informática, Bloco C5, Piso 1, 1700 Lisboa, Portugal, jpn@di.fc.ul.pt

II Faculty of Industrial Engineering and Management, Technion City, Haifa 32000, Israel, iehava@ie.technion.ac.il

III Instituto Superior Técnico, Dept. Matemática, Av. Rovisco Pais, 1049-001 Lisboa, Portugal, fgc@math.ist.utl.pt

ABSTRACT

In this paper we show that programming languages can be translated into recurrent (analog, rational-weighted) neural nets. Implementing programming languages in neural nets turns out to be not only theoretically exciting, but also to have practical implications for the recent efforts to merge symbolic and subsymbolic computation. To be of practical use, such an implementation should be carried out in a context of bounded resources. Herein, we show how to use resource bounds to speed up computations over neural nets, through suitable data-type coding as in conventional programming languages. We introduce data types and show how to code and keep them inside the information flow of neural nets. Data types and control structures are part of a suitable programming language called NETDEF. Each NETDEF program has a specific neural net that computes it. These nets have a strongly modular structure and a synchronization mechanism that allows sequential or parallel execution of subnets, despite the massively parallel nature of neural nets. Each instruction denotes an independent neural net, and there are constructors for assignment, conditional, and loop instructions. Beyond the language core, many other features can be obtained with the same method. A NETDEF compiler is available at http://www.di.fc.ul.pt/~jpn/netdef/netdef.htm.

Keywords: Neural Networks, Neural Computation, Symbolic Processing, NETDEF.
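
To make the abstract's compilation idea concrete, the following is a minimal Python sketch of the kind of analog recurrent net the paper builds on: rational-weighted neurons with a saturated-linear activation, updated synchronously. The tiny subnet below mimics a NETDEF-style assignment with the in/out handshake that lets subnets run sequentially inside a massively parallel net: on an "in" control pulse it latches a source value into a target neuron, then fires an "out" pulse. All names and the exact wiring (sigma, assignment_subnet, gate, delay) are illustrative assumptions, not output of the actual NETDEF compiler.

    from fractions import Fraction

    def sigma(x):
        # Saturated-linear activation: 0 below 0, identity on [0, 1], 1 above 1.
        return max(Fraction(0), min(Fraction(1), x))

    def assignment_subnet(source, pulse_step, total_steps=6):
        # Four neurons, all updated synchronously from the previous state.
        gate = target = delay = out = Fraction(0)
        trace = []
        for t in range(total_steps):
            in_pulse = Fraction(int(t == pulse_step))
            gate_n   = sigma(source + in_pulse - 1)  # passes `source` only during the pulse
            target_n = sigma(target + gate)          # integrates the gated value once, then holds it
            delay_n  = sigma(in_pulse)               # one-step delay line so that `out`
            out_n    = sigma(delay)                  # fires exactly when `target` is ready
            gate, target, delay, out = gate_n, target_n, delay_n, out_n
            trace.append((t + 1, target, out))
        return trace

    for t, target, out in assignment_subnet(Fraction(2, 3), pulse_step=1):
        print(f"t={t}: target={target} out={out}")
    # With the control pulse at step 2, `target` latches 2/3 at step 3,
    # `out` fires at that same step, and the value is held thereafter.

Chaining the "out" pulse of one such subnet into the "in" of the next yields sequential execution; feeding one pulse into several "in" lines yields parallel execution, which is how the NETDEF nets are described as composing despite the net's inherent parallelism.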


Publication Dates

  • Publication in this collection
    16 Sept 2004
  • Date of issue
    Apr 2003
Sociedade Brasileira de Computação - UFRGS, Av. Bento Gonçalves 9500, B. Agronomia, Caixa Postal 15064, 91501-970 Porto Alegre, RS, Brazil. Tel./Fax: (55 51) 316-6835.
E-mail: jbcs@icmc.sc.usp.br