Building a better rubric

Mixed methods rubric revision

Gerriet Janssen, Valerie Meier, Jonathan Trace

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

Because rubrics are the foundation of a rater's scoring process, principled rubric use requires systematic review as rubrics are adopted and adapted into different local contexts (Crusan, 2010, p. 72). However, detailed accounts of rubric adaptation remain somewhat rare. This article presents a mixed-methods (Brown, 2015) study assessing the functioning of a well-known rubric (Jacobs, Zinkgraf, Wormuth, Hartfiel, & Hughey, 1981, p. 30) using both Rasch measurement and profile analysis (n = 524): Rasch measurement was used to analyze the rubric's scale structure, and profile analysis to describe how well the rubric classified examinees. After finding a lack of distinction within the rubric's scale structure, the authors adapted the rubric according to theoretical and empirical criteria. The revised rubric was then piloted by two raters from outside the program and analyzed again with Rasch measurement, with placement again examined through profile analysis (n = 80). While the revised rubric can still be fine-tuned, this study describes how one research team established an ongoing process of rubric analysis, a practice the authors recommend for other contexts that rely on high-stakes performance assessment.
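To make the category-functioning issue concrete, the following is a minimal, self-contained sketch of one descriptive check often reported alongside Rasch category diagnostics: whether examinees awarded adjacent score categories on a criterion actually differ in overall performance. This is an illustration only, not the authors' analysis; the data, the 1-4 scale, and the thresholds are simulated, and the criterion names merely echo the Jacobs et al. (1981) ESL Composition Profile components as labels.

import numpy as np

rng = np.random.default_rng(0)

# Simulated ratings: 200 examinees scored 1-4 on five rubric criteria
# (labels only; the actual Jacobs et al. profile uses weighted point bands).
criteria = ["content", "organization", "vocabulary", "language_use", "mechanics"]
ability = rng.normal(size=200)
noise = rng.normal(scale=0.7, size=(200, 5))
ratings = np.clip(np.round(2.5 + ability[:, None] + noise), 1, 4).astype(int)

total = ratings.sum(axis=1)  # crude proxy for overall writing ability

def category_means(scores, proxy):
    """Mean of the ability proxy among examinees awarded each score category."""
    return {int(k): float(proxy[scores == k].mean()) for k in np.unique(scores)}

for j, name in enumerate(criteria):
    means = category_means(ratings[:, j], total)
    values = list(means.values())
    ordered = all(a < b for a, b in zip(values, values[1:]))
    summary = ", ".join(f"{k}: {v:5.1f}" for k, v in means.items())
    flag = "monotonic" if ordered else "categories may not be distinct"
    print(f"{name:13s} {summary}   [{flag}]")

If the mean of the proxy does not rise monotonically across a criterion's score categories, adjacent categories are not separating examinees, which is the kind of scale-structure problem the study addresses with proper Rasch category statistics rather than raw means.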

Original language: English
Pages (from-to): 51-66
Number of pages: 16
Journal: Assessing Writing
Volume: 26
ISSN: 1075-2935
DOI: 10.1016/j.asw.2015.07.002
Publication status: Published - 1 October 2015
Externally published: Yes


Keywords

  • Academic writing
  • Mixed-methods
  • Profile analysis
  • Rasch measurement
  • Rubrics

ASJC Scopus subject areas

  • Language and Linguistics
  • Education
  • Linguistics and Language

Cite this

Janssen, G., Meier, V., & Trace, J. (2015). Building a better rubric: Mixed methods rubric revision. Assessing Writing, 26, 51-66. https://doi.org/10.1016/j.asw.2015.07.002