Robust Bayesian regression with synthetic posterior distributions

Shintaro Hashimoto, Shonosuke Sugasawa

Research output: Contribution to journal › Article › peer-review


Although linear regression models are fundamental tools in statistical science, their estimation results can be sensitive to outliers. Several robust methods have been proposed in frequentist frameworks, but statistical inference under them is not necessarily straightforward. We propose a Bayesian approach to robust inference on linear regression models using synthetic posterior distributions based on the γ-divergence, which enables us to naturally assess the uncertainty of the estimation through the posterior distribution. We also consider the use of shrinkage priors for the regression coefficients to carry out robust Bayesian variable selection and estimation simultaneously. We develop an efficient posterior computation algorithm by adopting the Bayesian bootstrap within Gibbs sampling. The performance of the proposed method is illustrated through simulation studies and applications to well-known datasets.
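The core idea can be sketched in code. The following is a minimal illustration, not the authors' algorithm: it assumes a normal linear model with the error scale σ fixed and known, a flat prior on the coefficients, and a plain random-walk Metropolis sampler in place of the paper's Bayesian-bootstrap-within-Gibbs scheme. With σ fixed, the γ-divergence loss for each observation reduces (up to constants in β) to a bounded exponential term, so gross outliers contribute essentially nothing to the synthetic posterior. The value γ = 0.5 and all names below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = 1 + 2*x + N(0,1) noise, with 10 gross outliers.
n = 100
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, n)
y[:10] += 20.0  # contamination
X = np.column_stack([np.ones(n), x])

def gamma_loglik(beta, gamma=0.5, sigma=1.0):
    """Synthetic log-likelihood based on the gamma-divergence.

    For a normal density with sigma fixed, f(y; x'beta, sigma)^gamma is
    proportional to exp(-gamma * r^2 / (2 sigma^2)). Constants depending
    only on sigma are dropped here, which rescales (tempers) the synthetic
    posterior but does not move its mode. Each term is bounded above, so
    an observation with a huge residual contributes ~0 instead of
    dominating the fit.
    """
    r = y - X @ beta
    return np.sum(np.exp(-gamma * r**2 / (2.0 * sigma**2))) / gamma

# Ordinary least squares, for comparison (pulled toward the outliers).
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Random-walk Metropolis on the synthetic posterior (flat prior),
# initialized at the OLS estimate.
beta = beta_ols.copy()
cur = gamma_loglik(beta)
draws = []
for it in range(6000):
    prop = beta + rng.normal(0.0, [0.2, 0.05])
    new = gamma_loglik(prop)
    if np.log(rng.uniform()) < new - cur:  # flat prior: ratio of synthetic likelihoods
        beta, cur = prop, new
    if it >= 1000:  # discard burn-in
        draws.append(beta)
draws = np.asarray(draws)
beta_rob = draws.mean(axis=0)

print("OLS estimate   :", beta_ols)
print("robust estimate:", beta_rob)  # close to the true (1, 2)
```

Unlike a point estimate from a frequentist robust loss, the retained draws also give credible intervals directly, which is the practical appeal of the synthetic-posterior formulation; the paper's Bayesian bootstrap within Gibbs sampling serves the same purpose more efficiently and handles unknown σ and shrinkage priors.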

Original language: English
Article number: 661
Issue number: 6
Publication status: Published - 2020 Jun 1
Externally published: Yes


Keywords

  • Bayesian bootstrap
  • Bayesian lasso
  • Divergence
  • Gibbs sampling
  • Linear regression

ASJC Scopus subject areas

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • Electrical and Electronic Engineering

