On Optimal Computations Using Perceptrons

Valeriu Beiu

    Research output: Contribution to journal › Article › peer-review


    This paper discusses optimal solutions for implementing arbitrary Boolean functions using perceptrons. It starts by presenting neural structures and their biological inspirations, while mentioning the simplifications leading to artificial neural networks. The state of the art in using neural networks as universal approximators, as well as size-optimal perceptron solutions, is briefly reviewed. Afterwards we detail a result of Horne and Hush (1994), showing that a neural network of perceptrons restricted to fan-in 2 can implement arbitrary Boolean functions, but requires O(2^n/n) perceptrons in O(n) layers. This result is generalised to arbitrary fan-ins, and used to prove that all relative minima of size are obtained for sub-linear ('small') fan-in values. On the one hand, this result shows a limitation of using perceptrons to implement arbitrary Boolean functions. On the other hand, the result is in good agreement with hardware (i.e. VLSI implementations), where the area and the delay are directly related to fan-ins (and to the precision of the synaptic weights). The main conclusion is that discrete VLSI-efficient solutions are connectivity (fan-in) limited even when using perceptrons.
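    As a rough illustration of how the Horne–Hush bound cited above grows, the sketch below tabulates the leading term of the fan-in-2 size estimate, 2^n/n, for a few input counts n. Constants are omitted and this is purely illustrative of the stated asymptotics, not the paper's exact construction:

```python
# Illustrative only: leading term of the asymptotic size bound
# O(2^n / n) for fan-in-2 perceptron networks implementing arbitrary
# n-input Boolean functions (Horne & Hush, 1994); constants omitted.

def size_bound_fanin2(n: int) -> float:
    """Leading term of the fan-in-2 size bound: 2^n / n perceptrons."""
    return 2 ** n / n

for n in range(4, 21, 4):
    # Depth for the fan-in-2 construction is O(n) layers.
    print(f"n = {n:2d}  size ~ 2^n/n = {size_bound_fanin2(n):>10.1f}  depth ~ O(n)")
```

    The exponential growth of the table's size column is the limitation the abstract refers to: even with unbounded weight precision, bounded-fan-in perceptron networks for arbitrary Boolean functions remain connectivity limited.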
    Original language: English
    Pages (from-to): 33-43
    Journal: Journal of Control Engineering and Applied Informatics
    Publication status: Published - Jan 1 2002

