Bayesian shrinkage prediction for the regression problem

Kei Kobayashi, Fumiyasu Komaki

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

We consider Bayesian shrinkage prediction for the Normal regression problem under the frequentist Kullback-Leibler risk. First, we consider the multivariate Normal model with an unknown mean and a known covariance, where the covariance of future samples may differ from that of the training samples while the unknown mean is fixed. We show that the Bayesian predictive distribution based on the uniform prior is dominated by those based on a class of priors, provided the prior distributions for the covariance and future covariance matrices are rotation invariant. We then consider a class of priors for the mean parameters that depend on the future covariance matrix; with such a prior, we can construct a Bayesian predictive distribution dominating the one based on the uniform prior. Finally, applying this result to the prediction of response variables in the Normal linear regression model, we show that there exists a Bayesian predictive distribution dominating that based on the uniform prior. Minimaxity of these Bayesian predictions follows from these results.
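The domination phenomenon the abstract describes can be illustrated numerically. The sketch below is a simplified stand-in, not the paper's construction: instead of a proper Bayesian predictive based on a superharmonic prior, it uses a plug-in predictive whose mean is the positive-part James-Stein shrinkage of the observation, and compares its Monte Carlo Kullback-Leibler risk against the uniform-prior predictive N(x, Σ + Σ̃). All parameter choices (d = 5, μ = 0, identity covariances) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5                      # dimension; shrinkage helps for d >= 3
mu = np.zeros(d)           # true mean (shrinkage gains are largest near 0)
Sigma = np.eye(d)          # covariance of the training sample X
Sigma_tilde = np.eye(d)    # covariance of the future sample Y
V = Sigma + Sigma_tilde    # predictive covariance under the uniform prior

def gauss_kl(m0, S0, m1, S1):
    """Closed-form KL( N(m0, S0) || N(m1, S1) )."""
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff
                  - len(m0) + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

n = 20000
risk_unif = risk_shrink = 0.0
for _ in range(n):
    x = rng.multivariate_normal(mu, Sigma)
    # uniform-prior Bayesian predictive: N(x, Sigma + Sigma_tilde)
    risk_unif += gauss_kl(mu, Sigma_tilde, x, V)
    # plug-in shrinkage predictive: positive-part James-Stein mean, same covariance
    shrink = max(0.0, 1.0 - (d - 2) / (x @ x))
    risk_shrink += gauss_kl(mu, Sigma_tilde, shrink * x, V)

risk_unif /= n
risk_shrink /= n
print(f"KL risk, uniform-prior predictive: {risk_unif:.3f}")
print(f"KL risk, shrinkage predictive:     {risk_shrink:.3f}")
```

At μ = 0 the shrinkage predictive shows a markedly smaller estimated Kullback-Leibler risk; the paper's stronger result is that a suitably chosen prior yields a predictive distribution dominating the uniform-prior one for every value of the mean.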

Original language: English
Pages (from-to): 1888-1905
Number of pages: 18
Journal: Journal of Multivariate Analysis
Volume: 99
Issue number: 9
DOIs
Publication status: Published - Oct 2008
Externally published: Yes

Keywords

  • MSC codes (primary/secondary): 62C10, 62F07, 62F15, 62J07
  • Bayesian prediction
  • Kullback-Leibler divergence
  • Minimaxity
  • Normal regression
  • Shrinkage estimation
  • Superharmonic function

ASJC Scopus subject areas

  • Statistics, Probability and Uncertainty
  • Numerical Analysis
  • Statistics and Probability
