A Constructive Approach to Calculating Lower Entropy Bounds

Valeriu Beiu, Sorin Draghici, Thierry De Pauw

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

This paper presents a constructive approach to estimating the size of a neural network necessary to solve a given classification problem. The results are derived using an information entropy approach in the context of limited precision integer weights. Such weights are particularly suited for hardware implementations since the area they occupy is limited, and the computations performed with them can be efficiently implemented in hardware. The considerations presented use an information entropy perspective and calculate lower bounds on the number of bits needed in order to solve a given classification problem. These bounds are obtained by approximating the classification hypervolumes with the volumes of several regular (i.e., highly symmetric) n-dimensional bodies. The bounds given here allow the user to choose the appropriate size of a neural network such that: (i) the given classification problem can be solved, and (ii) the network architecture is not oversized. All considerations presented take into account the restrictive case of limited precision integer weights, and therefore can be directly applied when designing VLSI implementations of neural networks.
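To make the abstract's central idea concrete, the sketch below computes a simple information-entropy lower bound on the number of bits a classifier must encode. It uses the elementary bound m·H(p) (m examples, H the Shannon entropy of the class distribution), which is a simplified illustration of the entropy perspective, not the paper's exact volume-based derivation; the function name and interface are invented for this example.

```python
import math

def entropy_lower_bound_bits(class_counts):
    """Illustrative entropy-based lower bound (hypothetical helper):
    to separate m labelled examples whose classes occur with
    frequencies p_i, a classifier must encode at least m * H(p) bits,
    where H is the Shannon entropy of the class distribution.
    Simplified sketch, not the paper's hypervolume-based bound."""
    m = sum(class_counts)
    # Shannon entropy of the empirical class distribution, in bits
    h = -sum((c / m) * math.log2(c / m) for c in class_counts if c > 0)
    return math.ceil(m * h)

# 100 examples split evenly between two classes: H = 1 bit,
# so at least 100 bits are needed.
print(entropy_lower_bound_bits([50, 50]))  # → 100
```

The paper's bounds are tighter because they also account for the geometry of the classification regions (approximated by regular n-dimensional bodies) and the restriction to limited-precision integer weights.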

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: Neural Processing Letters
Volume: 9
Issue number: 1
DOIs
Publication status: Published - 1999
Externally published: Yes

Keywords

  • Classification problems
  • Complexity
  • Constructive algorithms
  • Limited and integer weights
  • N-dimensional volumes
  • Number of bits

ASJC Scopus subject areas

  • Software
  • General Neuroscience
  • Computer Networks and Communications
  • Artificial Intelligence
