Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite

Tineke Brunfaut, Luke Harding, Aaron Batty

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.

Original language: English
Journal: Assessing Writing
DOI: 10.1016/j.asw.2018.02.003
Publication status: Accepted/In press - 2018 Jan 1

Keywords

  • Computer-based testing of writing
  • Mode of delivery
  • Online testing of writing
  • Paper-based testing of writing
  • Perceptions
  • Second language writing assessment

ASJC Scopus subject areas

  • Language and Linguistics
  • Education
  • Linguistics and Language

Cite this

Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite. / Brunfaut, Tineke; Harding, Luke; Batty, Aaron.

In: Assessing Writing, 01.01.2018.


@article{0004909130eb474e9f9abf816855e51a,
title = "Going online: The effect of mode of delivery on performances and perceptions on an English L2 writing test suite",
abstract = "In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.",
keywords = "Computer-based testing of writing, Mode of delivery, Online testing of writing, Paper-based testing of writing, Perceptions, Second language writing assessment",
author = "Tineke Brunfaut and Luke Harding and Aaron Batty",
year = "2018",
month = "1",
day = "1",
doi = "10.1016/j.asw.2018.02.003",
language = "English",
journal = "Assessing Writing",
issn = "1075-2935",
publisher = "Elsevier Limited",
}

TY - JOUR

T1 - Going online

T2 - The effect of mode of delivery on performances and perceptions on an English L2 writing test suite

AU - Brunfaut, Tineke

AU - Harding, Luke

AU - Batty, Aaron

PY - 2018/1/1

Y1 - 2018/1/1

AB - In response to changing stakeholder needs, large-scale language test providers have increasingly considered the feasibility of delivering paper-based examinations online. Evidence is required, however, to determine whether online delivery of writing tests results in changes to writing performance reflected in differential test scores across delivery modes, and whether test-takers hold favourable perceptions of online delivery. The current study aimed to determine the effect of delivery mode on the two writing tasks (reading-into-writing and extended writing) within the Trinity College London Integrated Skills in English (ISE) test suite across three proficiency levels (CEFR B1-C1). 283 test-takers (107 at ISE I/B1, 109 at ISE II/B2, and 67 at ISE III/C1) completed both writing tasks in paper-based and online mode. Test-takers also completed a questionnaire to gauge perceptions of the impact, usability and fairness of the delivery modes. Many-facet Rasch measurement (MFRM) analysis of scores revealed that delivery mode had no discernible effect, apart from the reading-into-writing task at ISE I, where the paper-based mode was slightly easier. Test-takers generally held more positive perceptions of the online delivery mode, although technical problems were reported. Findings are discussed with reference to the need for further research into interactions between delivery mode, task and level.

KW - Computer-based testing of writing

KW - Mode of delivery

KW - Online testing of writing

KW - Paper-based testing of writing

KW - Perceptions

KW - Second language writing assessment

UR - http://www.scopus.com/inward/record.url?scp=85042566575&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85042566575&partnerID=8YFLogxK

U2 - 10.1016/j.asw.2018.02.003

DO - 10.1016/j.asw.2018.02.003

M3 - Article

JO - Assessing Writing

JF - Assessing Writing

SN - 1075-2935

ER -