We prove two new lower bounds on the number of bits required by neural networks for classification problems defined by m examples from IR^n. The bounds are obtained constructively and can be used for designing constructive learning algorithms. The results rely on upper bounding the space with an n-dimensional ball (V. Beiu, 1996). Recently, a better upper bound was presented by V. Beiu and T. DePauw (1997), who showed that the volume of the ball can always be replaced by the volume of the intersection of two balls. A lower bound for the case of integer weights in the range [-p, p] was detailed by S. Draghici and I.K. Sethi (1997); it is based on computing the logarithm of the quotient between the volume of the ball containing all the examples and the maximum volume of a polyhedron. A first improvement comes from a tighter upper bound on the maximum volume of the polyhedron, using two n-dimensional cones instead of the ball. A second, even better, bound is obtained by upper bounding the space by the intersection of two balls.
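To illustrate the general shape of such volume-quotient bounds (a sketch only: the symbol V_max below is a placeholder for an upper bound on the maximum polyhedron volume, not the exact expression derived in the cited papers), the standard formula for the volume of the n-dimensional ball gives:

```latex
% Volume of the n-dimensional ball of radius R (standard formula),
% and the generic log-quotient form of a lower bound on the number of bits.
% V_max is a placeholder for a bound on the maximum polyhedron volume.
\[
  \mathrm{Vol}\bigl(B_n(R)\bigr) \;=\; \frac{\pi^{n/2}}{\Gamma\!\left(\frac{n}{2}+1\right)}\,R^{n},
  \qquad
  \#\mathrm{bits} \;\ge\; \log_2 \frac{\mathrm{Vol}\bigl(B_n(R)\bigr)}{V_{\max}} .
\]
```

Here B_n(R) is the ball containing all m examples, and V_max is an upper bound on the maximum volume of a polyhedron. Either tightening V_max (e.g., bounding the polyhedron by two n-dimensional cones) or replacing the enclosing ball by a smaller region (e.g., the intersection of two balls) increases the quotient and hence sharpens the lower bound.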