When Constants are Important

Valeriu Beiu

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


    In this paper the authors discuss several complexity aspects pertaining to neural networks, commonly known as the curse of dimensionality. The focus is on: (1) size complexity and depth-size tradeoffs; (2) complexity of learning; and (3) precision and limited interconnectivity. Results have been obtained for each of these problems when dealt with separately, but little is known about the links among them. The authors start by presenting known results and try to establish connections between them. These show that very difficult problems arise (exponential growth in either space, i.e., precision and size, and/or time, i.e., learning and depth) when resorting to neural networks for solving general problems. The paper presents a solution for lowering some of the constants by exploiting the depth-size tradeoff.
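The "curse of dimensionality" the abstract refers to can be illustrated with a standard textbook example (not taken from the paper itself): sampling a unit cube on a uniform grid requires a number of points that grows exponentially with the dimension.

```python
# Illustration (not from the paper): exponential growth with dimension.
# Covering the cube [0, 1]^d with a grid of k points per axis requires
# k**d samples, so the cost is exponential in the dimension d.
def grid_points(d, k=10):
    """Number of samples in a k-points-per-axis grid over [0, 1]^d."""
    return k ** d

for d in (1, 2, 5, 10):
    print(d, grid_points(d))  # 10, 100, 100000, 10000000000
```

This exponential blow-up in the number of samples is the space-side analogue of the exponential size and precision requirements discussed in the abstract.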
    Original language: English
    Title of host publication: International Conference on Control System and Computer Science
    Publication status: Published - May 28 1997
    Event: CSCS-11 - Bucharest, Romania
    Duration: May 28 1997 → …


    Period: 5/28/97 → …

