VALUE Research Hub

What Accounts for Integrated Reading-to-Write Task Scores?

Citation

Shin, S.-Y., & Ewert, D. (2015). What Accounts for Integrated Reading-to-Write Task Scores? Language Testing, 32(2), 259–281. https://doi.org/10.1177/0265532214560257

Abstract

Reading-to-write (RTW) tasks are becoming increasingly popular and have already been used in several high-stakes English proficiency exams, either replacing or complementing a prompt-based essay test. However, it is still not clear what accounts for successful or unsuccessful performance on an integrated reading-to-write task, owing to the hybrid nature of reading and writing skills and to potential rater effects on test score variability. Thus, in this study, data-driven analytic rubrics for the RTW task were developed first. Then, analytic subscores were obtained for 83 college ESL students' responses to the RTW task. Correlational analyses were first used to explore the relationship between the writing and reading skills engaged in different aspects of the RTW task. A multivariate G-study was also applied to examine the degree of variability in analytic subscores attributable to test takers and raters. The results indicate that an RTW task may tap into both reading and writing abilities, given the relatively high correlations observed among composite and separate analytic subscores and independent reading and writing scores. The multivariate G-study results also show that each analytic rating domain could capture differences in the variability of the proficiencies test takers drew on in the RTW task, and that raters assigned scores neither too harshly nor too leniently across the analytic rating domains. However, the results also reveal that the person and rater facets contributed differently to score variability in certain analytic categories. This study provides valuable insights into the nature of RTW tasks and has implications for rating rubric development for integrated tasks.

Themes: Correlation, English (Second Language), English Language Learners, English Teachers, Generalizability Theory, Language Proficiency, Language Tests, Multivariate Analysis, Reading Skills, Reading Tests, Scoring Rubrics, Test Items, Undergraduate Students, Writing Skills, Writing Tests