João Pedro Neto's PhD work
This work lies within the scientific areas of Theory of Computation and artificial neural networks. It investigates possible bridges between the two areas and tries to integrate their concepts into a common, broader computational framework.
This effort concentrates, firstly, on the definition of a computational architecture based on a certain neural model, together with a demonstration that its power is equivalent to that of Turing machines. A working compiler that translates between these computational paradigms (it compiles partial recursive functions into neural networks) is presented here. There is also a tool, called NETWAY, which can simulate this type of neural network with arbitrary precision.
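As a concrete, if tiny, illustration of the kind of network involved, here is a sketch of one synchronous update step for neurons with the saturated sigmoid activation mentioned later in this page (sigma(x) = 0 for x < 0, x for 0 <= x <= 1, 1 for x > 1). The two-neuron topology, weights, and biases are invented for the example; this is not code from the thesis.

    // Minimal sketch of one synchronous update step for a sigma-net.
    // The saturated sigmoid and the update rule follow the model described
    // above; the concrete weights and topology are invented for this example.
    public class SigmaNet {
        // sigma(x) = 0 for x < 0, x for 0 <= x <= 1, 1 for x > 1
        static double sigma(double x) {
            return Math.max(0.0, Math.min(1.0, x));
        }

        public static void main(String[] args) {
            double[] x = {0.3, 0.9};                 // current activations
            double[][] w = {{0.0, 1.0}, {0.5, 0.5}}; // w[i][j]: weight from j to i
            double[] b = {-0.2, 0.1};                // biases

            // x_i(t+1) = sigma( sum_j w[i][j] * x_j(t) + b_i ), all i in parallel
            double[] next = new double[x.length];
            for (int i = 0; i < x.length; i++) {
                double s = b[i];
                for (int j = 0; j < x.length; j++) s += w[i][j] * x[j];
                next[i] = sigma(s);
            }
            System.out.println(java.util.Arrays.toString(next));
        }
    }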
Secondly, a set of working tools was developed to take full advantage of this computational model. This was achieved with a high-level programming language, called NETDEF, and an automatic compilation process able to translate the algorithmic representation of a problem into a parallel and modular neural network; a compiler is available. These tools focus on two computational concepts: control and learning. In this context, control means all algorithms that use symbolic information, i.e., information with a well-defined context and meaning. Learning, on the other hand, consists of sub-symbolic algorithms, where no basic piece of information has an individual meaning and all knowledge is distributed.
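One way to see how such nets can carry symbolic control information: with the saturated sigmoid, boolean logic is directly expressible once activations are restricted to {0, 1}. The gate formulas below are a standard construction for this kind of unit, shown here as a hedged sketch rather than actual NETDEF compiler output.

    // Boolean control signals as sigma-neurons: a standard construction for
    // saturated-sigmoid units, assuming activations restricted to {0, 1}.
    public class SigmaGates {
        static double sigma(double x) { return Math.max(0.0, Math.min(1.0, x)); }

        static double and(double a, double b) { return sigma(a + b - 1.0); }
        static double or(double a, double b)  { return sigma(a + b); }
        static double not(double a)           { return sigma(1.0 - a); }

        public static void main(String[] args) {
            // Truth table for (a AND NOT b): each gate is one neuron, one time step
            for (double a : new double[]{0, 1})
                for (double b : new double[]{0, 1})
                    System.out.printf("a=%.0f b=%.0f -> %.0f%n", a, b, and(a, not(b)));
        }
    }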
New: there is a Java program (a runnable jar) that translates neural nets of this model into a set of triples that can be executed in parallel. Both the translator and a simulator for this triple processing are available.
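The exact triple format produced by that translator is not described here; the sketch below assumes one (source, weight, target) triple per synapse, with all triples accumulated in parallel before the activation function is applied. Both the format and the names are assumptions made for illustration.

    // Hypothetical triple-based simulation step: the (source, weight, target)
    // encoding is an assumption about the translator's output, not its spec.
    import java.util.Arrays;

    public class TripleSim {
        record Triple(int source, double weight, int target) {}

        static double sigma(double x) { return Math.max(0.0, Math.min(1.0, x)); }

        public static void main(String[] args) {
            Triple[] net = {
                new Triple(0, 1.0, 1),   // neuron 0 feeds neuron 1
                new Triple(1, 0.5, 0),   // neuron 1 feeds back to neuron 0
            };
            double[] x = {1.0, 0.0};

            // One parallel step: accumulate all triples, then apply sigma
            double[] sum = new double[x.length];
            for (Triple t : net) sum[t.target()] += t.weight() * x[t.source()];
            for (int i = 0; i < x.length; i++) x[i] = sigma(sum[i]);
            System.out.println(Arrays.toString(x));
        }
    }

A real run would iterate this step many times; a single step keeps the sketch short.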
Note: Unless stated otherwise, all software tools are for Windows™ only.
J. Neto, J. F. Costa, P. Carreira, and M. Rosa. A Compiler and Simulator for Partial Recursive Functions over Neural Networks, in Ahmad Lotfi and Jonathon M. Garibaldi (eds.), Applications and Science in Soft Computing, Springer-Verlag, 2004, in press. Download it in PDF format.
J. Neto, H. Siegelmann, J. Costa. Symbolic Processing in Neural Networks, Journal of the Brazilian Computer Society, 8(3), 58-70, 2003. Download it in gzipped PostScript or PDF format.
J. Neto and H. Coelho. A Distributed Search (Best-Path) Algorithm Based on Local Communication, International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC'2003), 377-384, 2003.
J. Neto, H. Coelho, A. Ferreira. Competitive Learning and Symbolic Computation Integration, 5th Brazilian Congress of Neural Nets, 19-24, 2001.
J. Neto, J. Costa, A. Ferreira. Merging Sub-symbolic and Symbolic Computation, Proceedings of the Second International ICSC Symposium on Neural Computation NC'00 (Bothe, H. and Rojas, R., eds.), ICSC Academic Press, 2000, 329-335. Download it in gzipped PostScript or PDF format.
J. Neto, J. Costa. Building Neural Net Software, Technical Report 99-5, 1999. Download it in PDF format.
J. Neto, H. Siegelmann, J. Costa. Information Coding and Neural Computing, Advances in A.I. and Engineering Cybernetics, Vol. IV: Systems Logic & Neural Networks, IIAS, [10], 1998, 76-80. Download it in PDF format.
J. Neto, H. Siegelmann, J. Costa. On the Implementation of Programming Languages with Neural Nets, First International Conference on Computing Anticipatory Systems (CASYS 97), [1], 201-208, CHAOS, 1998. Download it in PDF format.
J. Neto, H. Siegelmann, J. Costa, C. Araújo. Turing Universality of Neural Nets (revisited), Lecture Notes in Computer Science 1333, 361-366, Springer-Verlag, 1997. Download it in gzipped PostScript or PDF format.
J. Neto, J. Costa, H. Coelho. Lower Bounds of a Computational Power of a Synaptic Calculus, Lecture Notes in Computer Science 1240, 340-348, Springer-Verlag, 1997. Download it in gzipped PostScript or PDF format.
The thesis itself is also available in PDF format (in Portuguese).
The NETDEF neural networks are synchronous, i.e., they need a global clock so that all neurons update at the same time. Is it possible to devise an architecture able to support asynchronous updates? Each neuron would then process in its own time, approaching the apparent dynamics of the central nervous system of real animals.
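A toy version of the question, with an invented two-neuron topology: replace the all-at-once update with neurons firing one at a time in random order, so that later updates within the same sweep already see earlier results.

    // Sketch contrasting synchronous and asynchronous update order.
    // An asynchronous neuron updates in place, so later neurons in the
    // (random) order already see its new activation. Topology is invented.
    import java.util.Random;

    public class AsyncStep {
        static double sigma(double x) { return Math.max(0.0, Math.min(1.0, x)); }

        public static void main(String[] args) {
            double[][] w = {{0.0, 1.0}, {1.0, 0.0}};
            double[] x = {1.0, 0.0};
            Random rnd = new Random();

            // Asynchronous: pick neurons one at a time, update in place.
            for (int step = 0; step < 4; step++) {
                int i = rnd.nextInt(x.length);   // this neuron's "own time"
                double s = 0;
                for (int j = 0; j < x.length; j++) s += w[i][j] * x[j];
                x[i] = sigma(s);
                System.out.printf("update neuron %d -> x = [%.1f, %.1f]%n", i, x[0], x[1]);
            }
        }
    }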
Non-homogeneous activation functions seem another direction in which to expand NETDEF. The saturated sigmoid used here is appropriate for symbolic computation, and some learning functions are implementable with it (like Hebbian or competitive learning). Others, like backpropagation, are not (at least not in trivial ways). Adding other activation functions (like the standard sigmoid) could provide expedient ways to enlarge the scope of available learning functions.
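For instance, a plain Hebbian update, one of the rules mentioned above as implementable, needs only local pre- and post-synaptic activity. The learning rate and the clipping of weights to [0,1] in this sketch are assumptions made for the example.

    // Hebbian update sketch: weights grow with correlated activity.
    // The learning rate and the clipping of weights are assumptions.
    public class HebbSketch {
        public static void main(String[] args) {
            double[] pre = {1.0, 0.0, 1.0};  // presynaptic activations
            double post  = 1.0;              // postsynaptic activation
            double[] w   = {0.2, 0.2, 0.2};
            double eta   = 0.1;              // learning rate (assumed)

            // w_j <- w_j + eta * pre_j * post  (plain Hebb rule)
            for (int j = 0; j < w.length; j++)
                w[j] = Math.min(1.0, w[j] + eta * pre[j] * post);

            System.out.println(java.util.Arrays.toString(w));
        }
    }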
Introducing dynamic networks, i.e., the creation and destruction of neurons over time, may be the major step needed to support both recursion over symbolic functions and learning methods that rely on that concept (like ART networks).
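Such dynamics call for a growable representation. The sketch below keeps neurons in a list with explicit incoming links, so creating or destroying a neuron is a single insertion or removal, something a fixed weight matrix cannot express. All structural details are illustrative, not taken from the thesis.

    // Dynamic-network sketch: neurons stored in a list so they can be
    // created or destroyed at run time. All structural details are illustrative.
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class DynamicNet {
        static class Neuron {
            double activation;
            Map<Neuron, Double> in = new HashMap<>(); // incoming weighted links
            Neuron(double a) { activation = a; }
        }

        public static void main(String[] args) {
            List<Neuron> net = new ArrayList<>();
            Neuron a = new Neuron(0.5);
            net.add(a);

            // Grow: create a neuron wired to an existing one.
            Neuron b = new Neuron(0.0);
            b.in.put(a, 1.0);
            net.add(b);

            // Shrink: destroy a neuron and every link that targets it.
            net.remove(a);
            for (Neuron n : net) n.in.remove(a);

            System.out.println("neurons left: " + net.size());
        }
    }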
Improving the integration of symbolic and sub-symbolic computation. NETDEF already makes it possible to define complex symbolic tasks and assign them to neural nets; the next step is to create interaction between this unusual way of using neural nets and the more standard methods, namely learning and classification. There are two main goals: (a) interface definition, i.e., how symbolic and sub-symbolic modules can interact and communicate, and (b) how learning may be conducted in neural nets using only neural nets, i.e., without treating a neural net as a special data structure inside a von Neumann computer. The aim is to find a neural net description (a set of discrete dynamic equations) that can learn but is also able to control the learning process.
May 2002