Placing Feedforward Neural Networks Among Several Circuit Complexity Classes

Valeriu Beiu, Jan A. Peperstraete, Joos Vandewalle, Rudy Lauwereins

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


    This paper examines the circuit complexity of feedforward neural networks with sigmoid activation functions. The starting point is the complexity class NN_k defined in [18]. First, two additional complexity classes, NN_k^D and NN_k^{D,e}, having less restrictive fan-in and accuracy conditions than NN_k, are defined. We then prove several relations among these three classes and well-established circuit complexity classes. All the proofs are constructive, being built around binary adders. By relaxing the fan-in condition from logarithmic [18] to (almost) polynomial, the results show that substantially larger classes of sigmoid-activation feedforward neural networks can be implemented in polynomial-size Boolean circuits. This comes at the expense of a logarithmic factor increase in the number of layers, but with constant fan-in (of 2). The main conclusion is that there are interesting fan-in-dependent depth-size tradeoffs when trying to digitally implement…
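The depth-size tradeoff in the abstract can be made concrete with a small sketch. The helper below is not from the paper; it is an illustrative assumption that counts the depth of a balanced tree of bounded fan-in gates over n inputs, showing how restricting fan-in to 2 costs roughly a log2(n) factor in depth compared with a single gate of fan-in n:

```python
def tree_depth(n, fan_in=2):
    """Depth of a balanced tree of fan_in-input gates combining n inputs.

    A single gate of fan-in n has depth 1; rebuilding it from
    fan-in-2 gates needs a tree of depth about log2(n) -- the
    logarithmic-factor increase in layers mentioned in the abstract.
    """
    depth = 0
    width = n
    while width > 1:
        width = -(-width // fan_in)  # ceiling division: gates at this level
        depth += 1
    return depth

# A gate of fan-in 1024 collapses 1024 inputs in one layer,
# while fan-in-2 gates need a 10-layer tree for the same job.
print(tree_depth(1024, fan_in=1024))  # 1
print(tree_depth(1024, fan_in=2))     # 10
```

This is only the depth side of the tradeoff; the paper's constructions additionally bound circuit size polynomially.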
    Original language: English
    Title of host publication: World Conference on Neural Networks
    Publisher: Lawrence Erlbaum Associates, Inc., and INNS Press
    Publication status: Published - Jun 4 1994
    Event: WCNN'94 - San Diego, CA, USA
    Duration: Jun 4 1994 → …


    Period: 6/4/94 → …


