A New concurrent calibration method for nonequivalent group design under nonrandom assignment

Kei Miyazaki, Takahiro Hoshino, Shin Ichi Mayekawa, Kazuo Shigemasu

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This study proposes a new item parameter linking method for the common-item nonequivalent groups design in item response theory (IRT). Previous studies assumed that examinees are randomly assigned to either test form. However, examinees can frequently select their own test forms, and the tests taken often differ according to examinees' abilities. In such cases, concurrent calibration or multiple-group IRT modeling that does not model test form selection behavior can yield severely biased results. We propose a model in which test form selection behavior depends on test scores, and we estimate it with a Monte Carlo expectation maximization (MCEM) algorithm. This method provides adequate estimates of the item parameters.
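The MCEM machinery referenced in the abstract can be illustrated with a minimal sketch. The Python toy below fits a Rasch (one-parameter IRT) model by Monte Carlo EM, using importance sampling from the N(0, 1) ability prior for the E-step and one Newton step on the item difficulties for the M-step. It deliberately omits the paper's multiple-group structure and the multinomial logistic form-selection model; all names, sample sizes, and tuning constants are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a Rasch model: P(X_ij = 1) = sigmoid(theta_i - b_j)
n_persons, n_items = 500, 10
true_b = np.linspace(-1.5, 1.5, n_items)      # item difficulties
theta = rng.normal(0.0, 1.0, n_persons)       # abilities ~ N(0, 1)
eta_true = theta[:, None] - true_b[None, :]
X = (rng.random((n_persons, n_items)) < 1 / (1 + np.exp(-eta_true))).astype(float)

def mcem_rasch(X, n_iter=40, n_mc=300):
    """Monte Carlo EM for Rasch item difficulties with theta ~ N(0, 1)."""
    n, J = X.shape
    b = np.zeros(J)
    for _ in range(n_iter):
        # E-step (Monte Carlo): draw abilities from the prior and weight
        # each draw by the likelihood of every person's response pattern
        # (importance sampling with the prior as the proposal).
        t = rng.normal(0.0, 1.0, n_mc)        # (M,) ability draws
        eta = t[:, None] - b[None, :]         # (M, J) logits
        log_p1 = -np.log1p(np.exp(-eta))      # log sigmoid(eta)
        log_p0 = -np.log1p(np.exp(eta))       # log(1 - sigmoid(eta))
        loglik = X @ log_p1.T + (1 - X) @ log_p0.T   # (n, M)
        w = np.exp(loglik - loglik.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)     # per-person posterior weights
        # M-step: one Newton step on the expected complete-data log-likelihood.
        W = w.sum(axis=0)                     # (M,) total weight per draw
        P = 1 / (1 + np.exp(-eta))            # (M, J) model probabilities
        grad = W @ P - X.sum(axis=0)          # dQ/db_j
        hess = W @ (P * (1 - P))              # -(d2Q/db_j^2), positive
        b = b + grad / hess
    return b

b_hat = mcem_rasch(X)
```

On these simulated data the recovered difficulties track `true_b` closely; the full method in the paper additionally models which form each examinee selects, so that the missing responses become ignorable given the selection model.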

Original language: English
Pages (from-to): 1-19
Number of pages: 19
Journal: Psychometrika
Volume: 74
Issue number: 1
DOI: 10.1007/s11336-008-9076-x
Publication status: Published - March 2009
Externally published: Yes


Keywords

  • Common-item design
  • Concurrent calibration
  • IRT linking
  • Item response theory
  • Monte Carlo expectation maximization (MCEM) algorithm
  • Multinomial logistic regression model
  • Nonignorable missingness

ASJC Scopus subject areas

  • Applied Mathematics
  • Psychology (all)

Cite this

@article{e7ff04d030ce49048446c17a0cae1f41,
title = "A New concurrent calibration method for nonequivalent group design under nonrandom assignment",
abstract = "This study proposes a new item parameter linking method for the common-item nonequivalent groups design in item response theory (IRT). Previous studies assumed that examinees are randomly assigned to either test form. However, examinees can frequently select their own test forms and tests often differ according to examinees' abilities. In such cases, concurrent calibration or multiple group IRT modeling without modeling test form selection behavior can yield severely biased results. We proposed a model wherein test form selection behavior depends on test scores and used a Monte Carlo expectation maximization (MCEM) algorithm. This method provided adequate estimates of testing parameters.",
keywords = "Common-item design, Concurrent calibration, IRT linking, Item response theory, Monte Carlo expectation maximization (MCEM) algorithm, Multinomial logistic regression model, Nonignorable missingness",
author = "Miyazaki, Kei and Hoshino, Takahiro and Mayekawa, {Shin Ichi} and Shigemasu, Kazuo",
year = "2009",
month = "3",
doi = "10.1007/s11336-008-9076-x",
language = "English",
volume = "74",
pages = "1--19",
journal = "Psychometrika",
issn = "0033-3123",
publisher = "Springer New York",
number = "1",
}
