## Abstract

This paper is concerned with prediction for Gamma models, and more specifically with the estimation of a predictive density for Y ∼ Ga(α₂, β) under Kullback–Leibler loss, based on X ∼ Ga(α₁, β). The main focus is on situations where the scale parameter is subject to a constraint of the form β ∈ C = (a, b). We obtain representations for Bayes predictive densities and for the minimum risk equivariant predictive density in the unconstrained problem. It is shown that the generalized Bayes predictive density against the truncation of the non-informative prior onto C dominates the minimum risk equivariant predictive density and is minimax whenever a = 0 or b = ∞. Analytical comparisons of plug-in predictive densities Ga(α₂, β̂), which include the predictive MLE density, are obtained, with results applying as well to point estimation under the dual entropy loss β/β̂ − log(β/β̂) − 1. Numerical evaluations confirm that such predictive densities are much less efficient than some Bayesian alternatives in exploiting the parametric restriction. Finally, it is shown that variance-expansion improvements of the form Ga(α₂/k, kβ̂) of plug-in predictive densities can always be found for a subset of k > 1 values and non-degenerate β̂.
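The comparison underlying the abstract can be sketched numerically. The snippet below is an illustrative sketch, not the paper's method: it uses the closed-form Kullback–Leibler divergence between two Gamma densities (shape/scale parametrization) to compare a plug-in predictive density Ga(α₂, β̂) with a variance-expanded one Ga(α₂/k, kβ̂), which keeps the mean α₂β̂ while inflating the variance by the factor k. The numerical values of β, β̂, and k are assumptions chosen for illustration.

```python
# Sketch (not from the paper): KL divergence between Gamma densities,
# comparing a plug-in predictive density with a variance-expanded one.
from math import lgamma, log
from scipy.special import digamma  # psi function


def kl_gamma(shape_p, scale_p, shape_q, scale_q):
    """KL( Ga(shape_p, scale_p) || Ga(shape_q, scale_q) ), shape/scale form."""
    return ((shape_p - shape_q) * digamma(shape_p)
            - lgamma(shape_p) + lgamma(shape_q)
            + shape_q * log(scale_q / scale_p)
            + shape_p * (scale_p - scale_q) / scale_q)


if __name__ == "__main__":
    alpha2, beta = 2.0, 1.0   # hypothetical true density Ga(alpha2, beta)
    beta_hat = 0.7            # hypothetical point estimate of beta
    k = 1.5                   # expansion factor, k > 1

    # Plug-in density Ga(alpha2, beta_hat) vs expanded Ga(alpha2/k, k*beta_hat):
    # the expanded density has the same mean alpha2*beta_hat, variance scaled by k.
    kl_plugin = kl_gamma(alpha2, beta, alpha2, beta_hat)
    kl_expanded = kl_gamma(alpha2, beta, alpha2 / k, k * beta_hat)
    print(f"plug-in KL: {kl_plugin:.4f}   expanded KL: {kl_expanded:.4f}")
```

For this particular configuration the expanded density incurs a smaller KL divergence than the plug-in, consistent with the abstract's claim that variance expansion can improve on plug-in predictive densities for a range of k > 1.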

| Original language | English |
|---|---|
| Pages (from-to) | 56-68 |
| Number of pages | 13 |
| Journal | Journal of Statistical Planning and Inference |
| Volume | 185 |
| DOIs | |
| Publication status | Published - Jun 1 2017 |

## Keywords

- Bayes estimator
- Dominance
- Frequentist risk
- Gamma
- Kullback–Leibler loss
- Minimax
- Minimum risk equivariant
- Plug-in
- Predictive density
- Restricted parameter space
- Variance expansion

## ASJC Scopus subject areas

- Statistics and Probability
- Statistics, Probability and Uncertainty
- Applied Mathematics