Generalized predictive information criteria for the analysis of feature events

Mike K P So, Tomohiro Ando

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This paper develops two weighted measures for model selection by generalizing the Kullback-Leibler divergence measure. The concept of a model selection process that takes into account the special features of the underlying model is introduced using weighted measures. New information criteria are defined using the bias correction of an expected weighted log-likelihood estimator. Using weight functions that match the features of interest in the underlying statistical models, the new information criteria are applied to simulated studies of spline regression and copula model selection. Real data applications are also given for predicting the incidence of disease and for quantile modeling of environmental data.

Original language: English
Pages (from-to): 742-762
Number of pages: 21
Journal: Electronic Journal of Statistics
Volume: 7
Issue number: 1
DOIs: 10.1214/13-EJS788
Publication status: Published - 2013

Keywords

  • Feature matching
  • Information criteria
  • Model selection
  • Weighted Kullback-Leibler measure

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

Generalized predictive information criteria for the analysis of feature events. / So, Mike K P; Ando, Tomohiro.

In: Electronic Journal of Statistics, Vol. 7, No. 1, 2013, p. 742-762.

So, Mike K P; Ando, Tomohiro. / Generalized predictive information criteria for the analysis of feature events. In: Electronic Journal of Statistics. 2013 ; Vol. 7, No. 1. pp. 742-762.
@article{4b035aa9ac0a4527bdec6da2e9ae43af,
title = "Generalized predictive information criteria for the analysis of feature events",
abstract = "This paper develops two weighted measures for model selection by generalizing the Kullback-Leibler divergence measure. The concept of a model selection process that takes into account the special features of the underlying model is introduced using weighted measures. New information criteria are defined using the bias correction of an expected weighted log-likelihood estimator. Using weight functions that match the features of interest in the underlying statistical models, the new information criteria are applied to simulated studies of spline regression and copula model selection. Real data applications are also given for predicting the incidence of disease and for quantile modeling of environmental data.",
keywords = "Feature matching, Information criteria, Model selection, Weighted Kullback-Leibler measure",
author = "So, {Mike K P} and Tomohiro Ando",
year = "2013",
doi = "10.1214/13-EJS788",
language = "English",
volume = "7",
pages = "742--762",
journal = "Electronic Journal of Statistics",
issn = "1935-7524",
publisher = "Institute of Mathematical Statistics",
number = "1",

}

TY - JOUR

T1 - Generalized predictive information criteria for the analysis of feature events

AU - So, Mike K P

AU - Ando, Tomohiro

PY - 2013

Y1 - 2013

N2 - This paper develops two weighted measures for model selection by generalizing the Kullback-Leibler divergence measure. The concept of a model selection process that takes into account the special features of the underlying model is introduced using weighted measures. New information criteria are defined using the bias correction of an expected weighted log-likelihood estimator. Using weight functions that match the features of interest in the underlying statistical models, the new information criteria are applied to simulated studies of spline regression and copula model selection. Real data applications are also given for predicting the incidence of disease and for quantile modeling of environmental data.

AB - This paper develops two weighted measures for model selection by generalizing the Kullback-Leibler divergence measure. The concept of a model selection process that takes into account the special features of the underlying model is introduced using weighted measures. New information criteria are defined using the bias correction of an expected weighted log-likelihood estimator. Using weight functions that match the features of interest in the underlying statistical models, the new information criteria are applied to simulated studies of spline regression and copula model selection. Real data applications are also given for predicting the incidence of disease and for quantile modeling of environmental data.

KW - Feature matching

KW - Information criteria

KW - Model selection

KW - Weighted Kullback-Leibler measure

UR - http://www.scopus.com/inward/record.url?scp=84884944055&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84884944055&partnerID=8YFLogxK

U2 - 10.1214/13-EJS788

DO - 10.1214/13-EJS788

M3 - Article

AN - SCOPUS:84884944055

VL - 7

SP - 742

EP - 762

JO - Electronic Journal of Statistics

JF - Electronic Journal of Statistics

SN - 1935-7524

IS - 1

ER -