TY - JOUR
T1 - Adaptation of the tuning parameter in general Bayesian inference with robust divergence
AU - Yonekura, Shouto
AU - Sugasawa, Shonosuke
N1 - Funding Information:
SY was supported by the Japan Society for the Promotion of Science (KAKENHI) under Grant Number 21K17713. SS was supported by the Japan Society for the Promotion of Science (KAKENHI) under Grant Number 21H00699.
Publisher Copyright:
© 2023, The Author(s).
PY - 2023/4
Y1 - 2023/4
N2 - We introduce a novel methodology for robust Bayesian estimation with robust divergence (e.g., density power divergence or γ-divergence), indexed by tuning parameters. It is well known that the posterior density induced by robust divergence gives highly robust estimators against outliers if the tuning parameter is appropriately and carefully chosen. In a Bayesian framework, one way to find the optimal tuning parameter would be to use the evidence (marginal likelihood). However, we illustrate theoretically and numerically that the evidence induced by the density power divergence fails to select the optimal tuning parameter, since robust divergence is not regarded as a statistical model. To overcome this problem, we treat the exponential of robust divergence as an unnormalisable statistical model, and we estimate the tuning parameter by minimising the Hyvärinen score. We also provide adaptive computational methods based on sequential Monte Carlo samplers, enabling us to obtain the optimal tuning parameter and samples from the posterior distribution simultaneously. The empirical performance of the proposed method is demonstrated through simulations and an application to real data.
AB - We introduce a novel methodology for robust Bayesian estimation with robust divergence (e.g., density power divergence or γ-divergence), indexed by tuning parameters. It is well known that the posterior density induced by robust divergence gives highly robust estimators against outliers if the tuning parameter is appropriately and carefully chosen. In a Bayesian framework, one way to find the optimal tuning parameter would be to use the evidence (marginal likelihood). However, we illustrate theoretically and numerically that the evidence induced by the density power divergence fails to select the optimal tuning parameter, since robust divergence is not regarded as a statistical model. To overcome this problem, we treat the exponential of robust divergence as an unnormalisable statistical model, and we estimate the tuning parameter by minimising the Hyvärinen score. We also provide adaptive computational methods based on sequential Monte Carlo samplers, enabling us to obtain the optimal tuning parameter and samples from the posterior distribution simultaneously. The empirical performance of the proposed method is demonstrated through simulations and an application to real data.
KW - Density power divergence
KW - General Bayes
KW - Robustness
KW - Sequential Monte Carlo
KW - Tuning parameter estimation
UR - http://www.scopus.com/inward/record.url?scp=85147495203&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85147495203&partnerID=8YFLogxK
U2 - 10.1007/s11222-023-10205-7
DO - 10.1007/s11222-023-10205-7
M3 - Article
AN - SCOPUS:85147495203
SN - 0960-3174
VL - 33
JO - Statistics and Computing
JF - Statistics and Computing
IS - 2
M1 - 39
ER -