The formal definition of reference priors under a general class of divergence
Bayesian analysis is now widely used in both the theory and application of statistics, and the choice of prior plays a key role in any Bayesian analysis. Priors are of two types: subjective and objective. In practice, however, the difficulty of subjective elicitation and time constraints frequently force us to use objective priors constructed by formal rules. In this dissertation, our methodology is reference analysis, which we use to derive objective priors. Objective Bayesian inference depends only on the assumed model and the available data; the prior distribution used is least informative in a certain information-theoretic sense. Berger, Bernardo and Sun (2009) rigorously derived reference priors under Kullback-Leibler divergence. In the special case of common support and other regularity conditions, Ghosh, Mergel and Liu (2011) derived a general f-divergence criterion for prior selection. We generalize the results of Ghosh, Mergel and Liu (2011) to the case without common support and show how an explicit expression for the reference prior can be obtained under posterior consistency. This explicit expression can be used to derive new reference priors, both analytically and numerically.
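For readers unfamiliar with the divergence family named above, the following is a minimal sketch using the standard textbook definitions (the abstract itself does not fix notation): the f-divergence between distributions P and Q with densities p and q is defined for a convex function f satisfying f(1) = 0, and the Kullback-Leibler divergence used by Berger, Bernardo and Sun (2009) is recovered as the special case f(t) = t log t.

```latex
% f-divergence between P and Q (densities p, q), f convex with f(1) = 0:
D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{p(x)}{q(x)}\right) q(x)\, dx.

% Taking f(t) = t \log t recovers the Kullback-Leibler divergence:
D_f(P \,\|\, Q) \;=\; \int p(x) \log \frac{p(x)}{q(x)}\, dx \;=\; \mathrm{KL}(P \,\|\, Q).
```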
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.