Abstract
This paper discusses several complexity aspects of neural networks, commonly grouped under the curse of dimensionality. The focus is on: (1) size complexity and depth-size tradeoffs; (2) the complexity of learning; and (3) precision and limited interconnectivity. Results have been obtained for each of these problems taken separately, but little is known about the links among them. The authors begin by presenting known results and then try to establish connections between them. These connections show that very difficult problems arise when resorting to neural networks for solving general problems: exponential growth in either space (i.e., precision and size) and/or time (i.e., learning and depth). The paper presents a way to lower some of these constants by exploiting the depth-size tradeoff.
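The depth-size tradeoff the abstract alludes to can be made concrete with a classical bound from Boolean circuit complexity (Håstad's parity lower bound); this is a standard illustrative example of the phenomenon, not a result stated in the paper itself:

```latex
% Hastad's lower bound (1986), given only to illustrate the kind of
% depth-size tradeoff the abstract refers to; not a result of this paper.
% Any depth-d circuit of AND/OR gates computing the n-bit parity function
%   PAR_n(x_1, ..., x_n) = x_1 XOR ... XOR x_n
% must have size exponential in n^{1/(d-1)}:
\mathrm{PAR}_n(x_1,\dots,x_n) \;=\; x_1 \oplus \cdots \oplus x_n,
\qquad
\text{size}\bigl(C_d\bigr) \;\geq\; 2^{\,\Omega\left(n^{1/(d-1)}\right)}.
% Each additional layer of depth shrinks the exponent, so allowing a
% deeper network can reduce size exponentially -- trading depth for size.
```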
Original language | English
---|---
Title of host publication | International Conference on Control System and Computer Science
Publication status | Published - May 28 1997
Event | CSCS-11 - Bucharest, Romania; Duration: May 28 1997 → …
Conference
Conference | CSCS-11
---|---
Period | 5/28/97 → …