Limiting the Shrinkage for the Exceptional by Objective Robust Bayesian Analysis
Pericchi Guerra, Luis Raúl
Pérez, María Eglée
Modern Statistics is built on the sensible combination of direct evidence (the data directly relevant, or the "individual data") and indirect evidence (the data and knowledge indirectly relevant, or the "group data"). Admissible procedures combine the two sources of information, and advancing technology is making indirect evidence more substantial and ubiquitous. It has been pointed out, however, that when "borrowing strength" an important problem of Statistics is to treat in a fundamentally different way exceptional cases, cases that do not conform to the central "aurea mediocritas". This has recently been coined "the Clemente problem" (Efron, 2009). In this article we argue that the problem is caused by the simultaneous use of squared-error loss and conjugate (light-tailed) priors, which is the usual procedure. We propose instead to use robust penalties, in the form of losses that penalize huge errors more severely, or (equivalently) heavy-tailed priors that make the exceptional more probable. Using heavy-tailed priors we can reproduce, in a Bayesian way, Efron and Morris' "limited translation estimators" (with Double Exponential priors) and "discarding-prior estimators" (with Cauchy-like priors), which discard the prior in the presence of conflict. Both Empirical Bayes and Full Bayes approaches are able to alleviate the Clemente problem and, furthermore, beat the James-Stein estimator in terms of smaller squared errors for sensible Robust Bayes priors. We model Empirical Bayes and fully Bayesian hierarchical models in parallel, illustrating that the differences among sensible versions of both are minute compared with the effect of the robust assumptions. We propose a heavy-tailed Beta2 distribution for variances that arises naturally as an alternative to the usual Inverted-Gamma distribution.
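The qualitative contrast described above can be seen in a small numerical sketch (not the paper's own computations): for an observation x ~ N(theta, 1), the posterior mean under a conjugate N(0, 1) prior shrinks every x by the same factor 1/2, while under a heavy-tailed Cauchy(0, 1) prior the shrinkage is "limited" and the prior is effectively discarded for extreme x. The unit scales and the quadrature grid are illustrative assumptions.

```python
import numpy as np

# Integration grid for theta; wide enough to cover the posteriors considered.
GRID = np.linspace(-60.0, 60.0, 240_001)

def posterior_mean(x, log_prior):
    """Posterior mean of theta given x ~ N(theta, 1), by quadrature."""
    log_post = -0.5 * (x - GRID) ** 2 + log_prior(GRID)  # unnormalized
    w = np.exp(log_post - log_post.max())                # numerically stable
    return float(np.sum(GRID * w) / np.sum(w))

def normal_log_prior(t):   # conjugate N(0, 1): light tails
    return -0.5 * t ** 2

def cauchy_log_prior(t):   # Cauchy(0, 1): heavy tails
    return -np.log1p(t ** 2)

for x in (2.0, 10.0):
    print(f"x={x:5.1f}  normal-prior mean={posterior_mean(x, normal_log_prior):6.3f}"
          f"  cauchy-prior mean={posterior_mean(x, cauchy_log_prior):6.3f}")
```

For a moderate observation both priors shrink toward zero; for an extreme one the conjugate prior still shrinks by half, whereas the Cauchy posterior mean stays close to the observation, which is the robustness property the abstract attributes to heavy-tailed priors.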
The combination of a Cauchy prior for location and a Beta2 prior for scale yields a novel closed-form prior for location that we call Beta2-Cauchy, extremely suitable for Objective Robust Bayesian Analysis (ORBA).
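As a sketch of the variance prior mentioned above: a common scaled Beta2 (beta-prime) parameterization is f(v) = C (v/b)^(p-1) (1 + v/b)^(-(p+q)) / b with C = Gamma(p+q) / (Gamma(p) Gamma(q)); the paper's exact parameterization may differ. Both of its tails are polynomial, v^(p-1) near zero and v^(-(q+1)) at infinity, unlike the Inverted-Gamma, whose density vanishes exponentially fast near v = 0.

```python
import numpy as np
from math import lgamma

def beta2_pdf(v, p=2.0, q=2.0, b=1.0):
    """Scaled Beta2 density for a variance v > 0 (assumed parameterization:
    shape p, tail index q, scale b)."""
    log_c = lgamma(p + q) - lgamma(p) - lgamma(q)
    return np.exp(log_c) * (v / b) ** (p - 1) / (b * (1.0 + v / b) ** (p + q))

# Midpoint-rule check that the density integrates to (almost) 1; the mass
# beyond v = 2000 is negligible since the tail decays like v^(-3) for q = 2.
dv = 1e-3
v = (np.arange(2_000_000) + 0.5) * dv   # midpoints on (0, 2000)
mass = float(np.sum(beta2_pdf(v)) * dv)
print(mass)
```

The polynomial tail is what lets the scale "escape" toward large values when the data demand it, mirroring for variances the role the Cauchy prior plays for the location.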