During the last few years, France, like many other countries, has been undergoing a transition from paper-based to digital large-scale assessments to measure student performance in education. There is rising interest in France in technology-enhanced items (TEIs), which offer new ways to assess traditional competencies as well as to address higher-order skills through computer-based assessments, specifically in mathematics. The rich data captured by these items, often referred to as process data, provides insight into how students tackle an item and which problem-solving strategies they adopt. "Big data" solutions were set up in France to handle the large volume and complexity of the data generated by technology-based assessments.
A theory-driven methodology based on findings from research in the didactics of mathematics, mainly adapting the instrumental approach to assessment, enabled us to gain fine-grained insight into students' mathematical conceptions as captured in TEIs.
However, process data recorded by TEIs does not provide a complete and accurate picture of students' thinking and activity. Much can happen outside of the computer-based test environment, in the context of a classroom setting devised for standardized large-scale assessment purposes. One can, for instance, expect students to use external, non-technological artifacts such as scratch paper or a ruler. TEIs also cannot capture students' attitudes, emotions, or human interactions, although these can be valuable sources of information when making inferences about cognitive ability from the choice of a problem-solving strategy. Therefore, in order to validate our interpretation and use of process data, it is necessary to further inform our approach with qualitative, observational data on item response processes.
In this study, we used fine-grained, observational data on students' problem-solving activity on mathematics TEIs, collected through eye tracking, video analysis, and think-aloud interviews, in the context of a standardized test administration in France. In-depth data was collected on four students during a standardized grade 10 mathematics assessment administered in a classroom of 30 students. The study confirms that observational data on student response processes is complementary to the process data captured by TEIs. An example is given with a specific TEI in the domain of variation and relations, where the observational data revealed response processes that are not captured by TEI log data. The study provided insights into students' mathematical thinking in a specific testing environment, as well as into their responses to TEI modes and assessment formats that are seldom present in routine classroom practice. These findings have informed the TEI development process, improving the quality of the instruments from both didactic and technological/ergonomic points of view. Moreover, they have helped us refine the interpretation and use of process indicators derived from the logs, thereby improving TEI validity.