Effects on response time and accuracy of technology-enhanced cloze tests: an eye-tracking study
Héctor R. Ponce · Richard E. Mayer · Jirarat Sitthiworachart · Mario J. López
© Association for Educational Communications and Technology 2020
Abstract
The transition from paper-based tests to corresponding computer-administered tests allows for the incorporation of improved interfaces that support response making. The main research question is whether innovative interfaces affect test response time and/or response accuracy. This study compared performance on banked cloze tests using a conventional interface (based on paper-based formats for responding by writing a number in a box) versus cloze tests with improved interfaces (based on computer-based affordances such as dragging and dropping responses to fill in a blank). In a banked cloze test, the left side of the page shows a text with words deleted and replaced with numbered blanks, and the right side shows the word list with a space in which to write the corresponding number. In Experiment 1, 56 fourth graders in the conventional group responded more slowly but just as accurately, and spent more time looking at the word list on the right of the screen but equivalent time looking at the text, as compared to a group that took the test with an improved interface. In Experiment 2, the same pattern of results was replicated with 148 sixth graders and for each of three versions of improved interfaces as compared to the conventional interface. Results support the idea that the improved interface affected the response execution phase but not the response development phase of performance on the cloze test.

Keywords Computer-administered test · Technology-enhanced items · Cloze test · Drag-and-drop interface · Cognitive load
Objectives and rationale

Standardized tests are transitioning from their traditional paper-and-pencil implementations to similar computerized versions, such as the college admission ACT test in the US and the Programme for International Student Assessment (PISA). Computer-based assessments, however, continue to rely heavily on conventional multiple-choice items without taking full advantage of computer functionalities. To confront this issue, test developers have begun to explore the introduction of new types of test items, referred to as technology-enhanced items (Adkins and Guerreiro 2018; Bryant 2017; Qian et al. 2018; Russell 2016) or innovative item formats (Parshall and Guille 2016; Sireci and Zenisky 2006). These innovations may include the use of video and graphics (Dindar et al. 2015), responsive interfaces (Wan and Henly 2012), and complex simulations (Pan et al. 2018). However, technology-enhanced test items involve more complex delivery settings, and more research is needed on construct representation and validity, measurement impacts, and the experience and performance of examinees.