Abstract
This paper surveys results on the approximation capabilities of neural networks, as well as bounds on the size of threshold gate circuits. Based on an explicit numerical (i.e., constructive) algorithm for Kolmogorov's superpositions, we show that minimum-size neural networks implementing arbitrary Boolean functions use the identity function as the activation function of the neurons. Since classical AND-OR implementations, as well as threshold gate implementations, require exponential size in the worst case, it follows that size-optimal solutions for implementing arbitrary Boolean functions require analog circuitry. Conclusions and several comments on the required precision are presented.
Original language | English
---|---
Title of host publication | IEEE Vth Brazilian Symposium on Neural Networks
Publication status | Published - Dec 9 1998
Event | SBRN'98 - Belo Horizonte, Brazil, Dec 9 1998 → …
Conference

Conference | SBRN'98
---|---
Period | 12/9/98 → …