Enhanced lower entropy bounds with application to constructive learning

Valeriu Beiu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We prove two new lower bounds on the number of bits required by neural networks for classification problems defined by m examples from ℝ^n. They are obtained in a constructive way and can be used for designing constructive learning algorithms. The results rely on upper bounding the input space by an n-dimensional ball (V. Beiu, 1996). A better upper bound was recently presented by V. Beiu and T. DePauw (1997), who showed that the volume of the ball can always be replaced by the volume of the intersection of two balls. A lower bound for the case of integer weights in the range [-p, p] was detailed by S. Draghici and I.K. Sethi (1997); it is based on computing the logarithm of the quotient between the volume of the ball containing all the examples and the maximum volume of a polyhedron. A first improvement will come from a tighter upper bound on the maximum volume of the polyhedron, obtained using two n-dimensional cones instead of the ball. A second, even better, bound will be obtained by upper bounding the space by the intersection of two balls.
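To make the counting argument in the abstract concrete, the following is a minimal sketch (not the paper's exact derivation) of a volume-ratio entropy lower bound: the number of bits is bounded below by the base-2 logarithm of the quotient between the volume of an n-dimensional ball enclosing all examples and an upper bound on the volume of a single polyhedral region. The radius and the per-region volume bound used here are hypothetical inputs chosen only for illustration.

```python
# Sketch of the volume-ratio lower bound described in the abstract.
# Assumptions: the enclosing-ball radius R and the maximum polyhedral
# cell volume V_cell are given; both values below are hypothetical.

from math import pi, gamma, log2


def ball_volume(n: int, radius: float) -> float:
    """Volume of an n-dimensional ball of the given radius."""
    return (pi ** (n / 2) / gamma(n / 2 + 1)) * radius ** n


def entropy_lower_bound_bits(n: int, radius: float, max_cell_volume: float) -> float:
    """log2 of the quotient between the enclosing ball's volume and the
    maximum volume of one polyhedral region (the counting argument
    sketched in the abstract)."""
    return log2(ball_volume(n, radius) / max_cell_volume)


if __name__ == "__main__":
    # Purely illustrative numbers.
    n, radius, max_cell_volume = 10, 1.0, 1e-4
    bits = entropy_lower_bound_bits(n, radius, max_cell_volume)
    print(f"lower bound: {bits:.1f} bits")
```

The paper's improvements correspond to replacing the two volumes in this quotient: a tighter bound on the polyhedron volume (via two n-dimensional cones) decreases the denominator, and replacing the ball by the intersection of two balls decreases the numerator, so the resulting bounds differ in how sharply the quotient is estimated.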

Original language: English
Title of host publication: Proceedings - 23rd Euromicro Conference
Subtitle of host publication: New Frontiers of Information Technology, EUROMICRO 1997
Pages: 541-548
Number of pages: 8
DOIs
Publication status: Published - 1997
Externally published: Yes
Event: 23rd EUROMICRO Conference on New Frontiers of Information Technology, EUROMICRO 1997 - Budapest, Hungary
Duration: Sept 1, 1997 - Sept 4, 1997

Publication series

Name: Conference Proceedings of the EUROMICRO
ISSN (Print): 1089-6503

Conference

Conference: 23rd EUROMICRO Conference on New Frontiers of Information Technology, EUROMICRO 1997
Country/Territory: Hungary
City: Budapest
Period: 9/1/97 - 9/4/97

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture
