Utilizing Response Time for Item Selection in On-the-Fly Multistage Adaptive Testing for PISA Assessment

Xiuxiu Tang, Yi Zheng, Tong Wu, Kit Tai Hau, Hua Hua Chang

Research output: Contribution to journal › Article › peer-review

Abstract

Multistage adaptive testing (MST) has recently been adopted for international large-scale assessments such as the Programme for International Student Assessment (PISA). MST offers improved measurement efficiency over traditional nonadaptive tests and improved practical convenience over single-item-adaptive computerized adaptive testing (CAT). As a third adaptive test design, alternative to both MST and CAT, Zheng and Chang proposed "on-the-fly multistage adaptive testing" (OMST), which combines the benefits of MST and CAT while offsetting their limitations. In this study, we adopted the OMST design while also incorporating response time (RT) into item selection. Via simulations emulating the PISA 2018 reading test, including using the real item attributes and replicating the PISA 2018 reading test's MST design, we compared the performance of our OMST designs against the simulated MST design in (1) measurement accuracy of test takers' ability, (2) test time efficiency and consistency, and (3) expected gains in precision by design. We also investigated the performance of OMST in item bank usage and constraints management. Results show great potential for the proposed RT-incorporated OMST designs to be used for PISA and, potentially, other international large-scale assessments.
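To illustrate the general idea of RT-incorporated item selection (not necessarily the authors' exact criterion), a common approach in the CAT-with-RT literature is to select the item maximizing Fisher information per expected response time, with RTs modeled by a lognormal model. The sketch below assumes a 2PL item response model and hypothetical item parameters; function and variable names are illustrative.

```python
import math

def fisher_info_2pl(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def expected_rt_lognormal(tau, alpha, beta):
    """Expected response time under a lognormal RT model:
    log T ~ N(beta - tau, 1/alpha^2), so E[T] = exp(beta - tau + 1/(2*alpha^2)),
    where tau is the examinee's speed, beta the item's time intensity,
    and alpha the item's time-discrimination."""
    return math.exp(beta - tau + 1.0 / (2.0 * alpha * alpha))

def select_item(theta, tau, items, administered):
    """Return the index of the unadministered item maximizing
    information per expected second; items are (a, b, alpha, beta) tuples."""
    best, best_val = None, -1.0
    for i, (a, b, alpha, beta) in enumerate(items):
        if i in administered:
            continue
        val = fisher_info_2pl(theta, a, b) / expected_rt_lognormal(tau, alpha, beta)
        if val > best_val:
            best, best_val = i, val
    return best
```

In an OMST context, a criterion like this would be applied stage by stage: interim ability and speed estimates after each stage drive the assembly of the next on-the-fly module, rather than a single item as in CAT.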

Original language: English (US)
Journal: Journal of Educational Measurement
DOIs: —
State: Accepted/In press - 2024

ASJC Scopus subject areas

  • Education
  • Developmental and Educational Psychology
  • Applied Psychology
  • Psychology (miscellaneous)

