A new concurrent calibration method for nonequivalent group design under nonrandom assignment

Kei Miyazaki, Takahiro Hoshino, Shin-Ichi Mayekawa, Kazuo Shigemasu

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This study proposes a new item parameter linking method for the common-item nonequivalent groups design in item response theory (IRT). Previous studies assumed that examinees are randomly assigned to one of the test forms. In practice, however, examinees can often select their own test form, and the forms frequently differ according to examinees' abilities. In such cases, concurrent calibration or multiple-group IRT modeling that does not model test form selection behavior can yield severely biased results. We propose a model in which test form selection behavior depends on test scores, and estimate it with a Monte Carlo expectation-maximization (MCEM) algorithm. This method provided adequate estimates of the item parameters.
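As a rough illustration of the MCEM machinery the abstract refers to, the sketch below fits a plain Rasch model by Monte Carlo EM: the E-step draws abilities from the N(0,1) prior and reweights them by each examinee's response likelihood, and the M-step takes one Newton step on the item difficulties. This is only a minimal sketch under assumed settings; it deliberately omits the paper's key ingredient, the (multinomial logistic) test form selection model, and all names, sample sizes, and tuning constants here are illustrative, not the authors' implementation.

```python
import numpy as np

# Simulate Rasch responses (illustrative data, not from the paper).
rng = np.random.default_rng(0)
n, J = 500, 6
theta_true = rng.normal(0.0, 1.0, n)          # latent abilities
b_true = np.linspace(-1.5, 1.5, J)            # item difficulties
prob = 1.0 / (1.0 + np.exp(-(theta_true[:, None] - b_true[None, :])))
Y = (rng.uniform(size=(n, J)) < prob).astype(float)

def mcem_rasch(Y, n_iter=50, M=300, seed=1):
    """MCEM for Rasch item difficulties.
    E-step: importance-sample theta from the N(0,1) prior and weight each
    draw by the person's response likelihood.
    M-step: one Newton ascent step on the weighted complete-data
    log-likelihood with respect to the difficulties b."""
    rng = np.random.default_rng(seed)
    n, J = Y.shape
    b = np.zeros(J)
    for _ in range(n_iter):
        theta = rng.normal(0.0, 1.0, M)                               # (M,) prior draws
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))      # (M, J)
        loglik = Y @ np.log(p).T + (1.0 - Y) @ np.log(1.0 - p).T      # (n, M)
        w = np.exp(loglik - loglik.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)         # per-person posterior weights
        wm = w.sum(axis=0)                        # total weight on each draw
        grad = wm @ p - Y.sum(axis=0)             # d(loglik)/d(b_j)
        hess = wm @ (p * (1.0 - p))               # positive curvature term
        b = b + grad / hess                       # Newton step (ascent)
    return b

b_est = mcem_rasch(Y)
```

In the paper's setting, the E-step would additionally weight draws by the probability of the observed form choice, and the M-step would also update the selection-model parameters; the sketch above shows only the generic sample-then-maximize loop.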

Original language: English
Pages (from-to): 1-19
Number of pages: 19
Journal: Psychometrika
Volume: 74
Issue number: 1
Publication status: Published - 2009 Mar 1
Externally published: Yes

Keywords

  • Common-item design
  • Concurrent calibration
  • IRT linking
  • Item response theory
  • Monte Carlo expectation maximization (MCEM) algorithm
  • Multinomial logistic regression model
  • Nonignorable missingness

ASJC Scopus subject areas

  • Psychology(all)
  • Applied Mathematics
