Abstract
This study proposes a new item parameter linking method for the common-item nonequivalent groups design in item response theory (IRT). Previous studies assumed that examinees are randomly assigned to one of the test forms. In practice, however, examinees can often select their own test forms, and the selected forms tend to differ according to examinees' abilities. In such cases, concurrent calibration or multiple-group IRT modeling that does not model test form selection behavior can yield severely biased results. We propose a model in which test form selection behavior depends on test scores, and estimate it with a Monte Carlo expectation-maximization (MCEM) algorithm. The method provided adequate parameter estimates.
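The key ingredient described in the abstract is a logistic model for test form selection that depends on test scores, which makes the responses to the unselected form nonignorably missing. The sketch below is a minimal simulation of that selection mechanism only, not the authors' method or code; the 2PL item parameters and selection coefficients are hypothetical, and the binary logistic choice stands in for the multinomial case.

```python
# Minimal sketch of score-dependent test form selection in a common-item
# nonequivalent groups design. All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_examinees = 5000
theta = rng.normal(0.0, 1.0, n_examinees)   # latent abilities

# Hypothetical 2PL parameters for one representative item per form.
a1, b1 = 1.2, -0.5                           # easier form (form 1)
a2, b2 = 1.2, 0.5                            # harder form (form 2)

def expected_score(a, b, theta):
    """Expected proportion correct under a 2PL item response model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Form selection via a binary logistic model: examinees with higher
# expected scores on form 2 are more likely to choose it, so assignment
# to forms is not random.
g0, g1 = -1.0, 4.0                           # hypothetical selection coefficients
p_form2 = 1.0 / (1.0 + np.exp(-(g0 + g1 * expected_score(a2, b2, theta))))
chose_form2 = rng.random(n_examinees) < p_form2

# The two self-selected groups differ in ability, which is the situation
# that biases linking methods assuming random assignment to forms.
print("mean ability, form 1 group:", theta[~chose_form2].mean())
print("mean ability, form 2 group:", theta[chose_form2].mean())
```

Under these assumed values the form 2 group has a noticeably higher mean ability, illustrating why the selection behavior must be modeled jointly with the item responses, as the proposed MCEM-based approach does.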
Original language | English |
---|---|
Pages (from-to) | 1-19 |
Number of pages | 19 |
Journal | Psychometrika |
Volume | 74 |
Issue number | 1 |
DOIs | |
Publication status | Published - 2009 Mar |
Externally published | Yes |
Keywords
- Common-item design
- Concurrent calibration
- IRT linking
- Item response theory
- Monte Carlo expectation maximization (MCEM) algorithm
- Multinomial logistic regression model
- Nonignorable missingness
ASJC Scopus subject areas
- Psychology (all)
- Applied Mathematics