TY - JOUR

T1 - On predictive density estimation for Gamma models with parametric constraints

AU - L'Moudden, Aziz

AU - Marchand, Éric

AU - Kortbi, Othmane

AU - Strawderman, William E.

N1 - Funding Information:
We are thankful to two anonymous reviewers for quite thoughtful and constructive comments. Éric Marchand's research is supported in part by a grant from the Natural Sciences and Engineering Research Council of Canada (grant #105806-2012), and William Strawderman's research is partially supported by grants from the Simons Foundation (#209035 and #418098).
Publisher Copyright:
© 2017 Elsevier B.V.

PY - 2017/6/1

Y1 - 2017/6/1

N2 - This paper is concerned with prediction for Gamma models, and more specifically with the estimation of a predictive density for Y∼Ga(α2,β) under Kullback–Leibler loss, based on X∼Ga(α1,β). The main focus is on situations where there is a parametric constraint of the form β∈C=(a,b). We obtain representations for Bayes predictive densities and for the minimum risk equivariant predictive density in the unconstrained problem. It is shown that the generalized Bayes estimator against the truncation of the non-informative prior onto C dominates the minimum risk equivariant predictive density and is minimax whenever a=0 or b=∞. Analytical comparisons of plug-in predictive densities Ga(α2,βˆ), which include the predictive MLE density, are obtained, with results applying as well to point estimation under the dual entropy loss β/βˆ − log(β/βˆ) − 1. Numerical evaluations confirm that such predictive densities are much less efficient than some Bayesian alternatives in exploiting the parametric restriction. Finally, it is shown that variance-expansion improvements of the form Ga(α2/k, kβˆ) of plug-in predictive densities can always be found for a subset of k>1 values and non-degenerate βˆ.

AB - This paper is concerned with prediction for Gamma models, and more specifically with the estimation of a predictive density for Y∼Ga(α2,β) under Kullback–Leibler loss, based on X∼Ga(α1,β). The main focus is on situations where there is a parametric constraint of the form β∈C=(a,b). We obtain representations for Bayes predictive densities and for the minimum risk equivariant predictive density in the unconstrained problem. It is shown that the generalized Bayes estimator against the truncation of the non-informative prior onto C dominates the minimum risk equivariant predictive density and is minimax whenever a=0 or b=∞. Analytical comparisons of plug-in predictive densities Ga(α2,βˆ), which include the predictive MLE density, are obtained, with results applying as well to point estimation under the dual entropy loss β/βˆ − log(β/βˆ) − 1. Numerical evaluations confirm that such predictive densities are much less efficient than some Bayesian alternatives in exploiting the parametric restriction. Finally, it is shown that variance-expansion improvements of the form Ga(α2/k, kβˆ) of plug-in predictive densities can always be found for a subset of k>1 values and non-degenerate βˆ.

KW - Bayes estimator

KW - Dominance

KW - Frequentist risk

KW - Gamma

KW - Kullback–Leibler loss

KW - Minimax

KW - Minimum risk equivariant

KW - Plug-in

KW - Predictive density

KW - Restricted parameter space

KW - Variance expansion

UR - http://www.scopus.com/inward/record.url?scp=85011559361&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85011559361&partnerID=8YFLogxK

U2 - 10.1016/j.jspi.2017.01.003

DO - 10.1016/j.jspi.2017.01.003

M3 - Article

AN - SCOPUS:85011559361

SN - 0378-3758

VL - 185

SP - 56

EP - 68

JO - Journal of Statistical Planning and Inference

JF - Journal of Statistical Planning and Inference

ER -