22–25 Jul 2025
Atlantic/Canary timezone

Real-time detection of unmotivated response behavior in questionnaires - Can immediate feedback influence future response behavior?

24 Jul 2025, 15:15
15m

Description

Unmotivated responses, identified using response times (as rapid guessing in cognitive tests, Wise & Kong, 2005, or as rapid responding in questionnaires, as part of careless and insufficient effort responding, C/IER), are a known threat to validity (e.g., Wise, 2017). The literature shows that unmotivated response behavior occurs more frequently in low-stakes assessments (Wise et al., 2009), among male test takers and respondents (e.g., DeMars, Bashkov, & Socha, 2013), and with increasing numbers of items (Lindner et al., 2019). However, psychometric models that treat identified rapid responses at the item level (e.g., Deribo et al., 2021) or that incorporate response time effort (RTE) as a process indicator at the person level can improve data quality and the validity of measurements only indirectly, as post hoc corrections applied to already contaminated data. This paper examines how unmotivated response behavior can be detected in real time during data collection and whether immediate feedback on the observed behavior, used as a micro-intervention, influences future responding.

Using an experimental design with between-subject variation, feedback shown when leaving a questionnaire page that indicates missing, monotonous (i.e., “straightlining”), or rapid (i.e., “rapid responding”) answers in a computerized questionnaire (experimental group) is compared with feedback that indicates missing answers only (control group). The questionnaire, which provides the log event data necessary to identify item-level response times on pages with several items, was administered in a national add-on study to PISA 2022 (N = 705). The position of contiguous questionnaire screens, each containing one scale, was counterbalanced using a balanced design with 18 booklets. Real-time detection of rapid responding was implemented using algorithmic processing of log events (Kroehne & Goldhammer, 2018) to extract the average answering time (AAT). The AAT is known from previous analyses (Kroehne et al., in press) to show a bimodal distribution in the presence of rapid response behavior, and a conservative time threshold of 1.0 seconds was chosen to detect rapid response behavior in the experimental condition.

The results confirm the expected bimodal distribution of the AAT, supporting the two hypothesized response processes in both the experimental and the control group. For both male and female 15-year-old students, a significant effect of the feedback on the average response time and on the probability of showing rapid response behavior was found (standardized odds ratios of 0.867 for boys and 0.734 for girls). In addition to the direct effects of the micro-intervention, which remain significant when controlling for position effects, we report further indirect effects on data quality (reliability, differential item functioning, and latent correlations) and present descriptive results of an in-situ question inserted for test takers on how the identified responses should be used. While the real-time detection of unmotivated response behavior affects future response behavior, the overall effect sizes of the micro-intervention are small. In the concluding section, the practical significance of the results for future computerized surveys is discussed.
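For illustration, the following Python sketch shows one possible way such page-level checks could be operationalized; it is not the study's implementation. It derives the average answering time (AAT) from per-item answering times obtained from log events and compares it with the reported 1.0-second threshold, alongside checks for missing and monotonous answers; in the control condition only the missing-answer check triggers feedback. The data structures, function names, and message texts are illustrative assumptions.

```python
# Minimal sketch (hypothetical, not the authors' implementation) of page-level
# checks that could trigger feedback when a respondent tries to leave a page.
from dataclasses import dataclass
from typing import List, Optional

AAT_THRESHOLD_SECONDS = 1.0  # conservative threshold reported in the abstract


@dataclass
class ItemResponse:
    """One item on the current questionnaire page."""
    answer: Optional[int]   # chosen response category, None if missing
    answering_time: float   # seconds spent answering this item (from log events)


def average_answering_time(items: List[ItemResponse]) -> Optional[float]:
    """Average answering time (AAT) over the answered items of a page."""
    times = [it.answering_time for it in items if it.answer is not None]
    return sum(times) / len(times) if times else None


def page_feedback(items: List[ItemResponse], experimental_group: bool) -> Optional[str]:
    """Return a feedback message to show on leaving the page, or None."""
    if any(it.answer is None for it in items):
        return "Some questions on this page have not been answered."

    if not experimental_group:
        return None  # control group: only missing answers trigger feedback

    answers = [it.answer for it in items]
    if len(answers) > 1 and len(set(answers)) == 1:
        return "All questions on this page were answered identically (straightlining)."

    aat = average_answering_time(items)
    if aat is not None and aat < AAT_THRESHOLD_SECONDS:
        return "The questions on this page were answered very quickly (rapid responding)."

    return None


if __name__ == "__main__":
    # Example page with three items answered in well under one second each:
    page = [ItemResponse(answer=2, answering_time=0.6),
            ItemResponse(answer=4, answering_time=0.8),
            ItemResponse(answer=3, answering_time=0.7)]
    print(page_feedback(page, experimental_group=True))  # flags rapid responding
```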

Primary authors

Carolin Hahnel, Frank Goldhammer, Leonard Tetzlaff, Lothar Persic-Beck, Ulf Kroehne
