We have just run a really small survey (3 pages, takes me 3 minutes to complete, ~240 participants). Page 1 is consent plus Prolific ID, page 2 is demographics, and page 3 holds the actual questions of interest: 10 simple Likert items.
On page 3 we are collecting timing data on all mouse clicks. Answer order does not matter, and when participants change a previously selected answer the extra time is added to that question's total, so clicking quickly and then reconsidering does not skew the results.
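To make the timing logic concrete, here is a minimal sketch of how per-question times can be derived from the raw click log (pandas; the column names `participant`, `question`, and `t_click` are illustrative, not our actual schema):

```python
import pandas as pd

# Illustrative click log for page 3: one row per answer click.
# t_click = seconds since the page loaded; all names are placeholders.
clicks = pd.DataFrame({
    "participant": ["p1", "p1", "p1", "p1"],
    "question":    ["q1", "q2", "q1", "q3"],  # q1 answered, then changed
    "t_click":     [5.2, 9.0, 11.5, 14.0],
})

clicks = clicks.sort_values(["participant", "t_click"])

# Time attributed to each click = gap since the previous click
# (or since page load for the first click).
clicks["delta"] = (
    clicks.groupby("participant")["t_click"].diff().fillna(clicks["t_click"])
)

# Summing per question means a changed answer adds time to that
# question's total instead of inflating a later one.
per_question = clicks.groupby(["participant", "question"])["delta"].sum()
print(per_question)
```

With that in mind, have a look at these statistics: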
Time taken after loading the page to respond to the first question:
Mean 8.12 s, median 5.89 s, fastest 10%: 3.6 s
Median time for each participant to respond to subsequent Likert-style questions:
Mean 4.12 s, median 3.88 s, fastest 10%: 2.34 s
Overall time to respond to items 2-10 (after removing participants who were more than 3 SD slower than the mean):
Mean 62.47 s, median 44.31 s, fastest 10%: 27.05 s
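For reference, the 3 SD trimming and summary stats were computed along these lines (a minimal numpy sketch, not our actual pipeline; note the cut is one-sided, since only slow outliers are removed):

```python
import numpy as np

def trimmed_summary(totals, sd_cutoff=3.0):
    """Drop values more than sd_cutoff SDs above the mean, then summarize.

    totals: per-participant total response time for items 2-10, in seconds.
    Only slow outliers are removed, so the cut is one-sided.
    """
    totals = np.asarray(totals, dtype=float)
    keep = totals <= totals.mean() + sd_cutoff * totals.std()
    kept = totals[keep]
    return {
        "n_dropped": int((~keep).sum()),
        "mean": kept.mean(),
        "median": np.median(kept),
        "fastest_10pct": np.percentile(kept, 10),
    }

# Toy example; the values here are made up, not our data.
print(trimmed_summary([40.0, 55.0, 62.0, 48.0, 300.0]))
```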
Drilling into the fastest ~10% there:
Lastly, looking at the total distribution of inter-item response times (i.e. the duration between clicks 2 and 3, 3 and 4, and so forth), cropping the very long tail at 1 SD:
Mean 6.12 s, median 3.76 s, fastest 10%: 1.75 s
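The inter-item durations are simply the gaps between consecutive answer clicks, pooled across participants; roughly (again a sketch with toy data, and the same one-sided crop as above):

```python
import numpy as np

def inter_item_gaps(click_times):
    """Gaps between consecutive answer clicks for one participant (seconds)."""
    return np.diff(np.sort(np.asarray(click_times, dtype=float)))

# Toy data: per-participant click times in seconds since page load.
click_times_by_participant = [
    [5.2, 9.0, 11.5, 14.0],
    [3.6, 5.9, 8.1, 10.0],
]

# Pool the gaps across participants, then crop the long tail at mean + 1 SD.
all_gaps = np.concatenate(
    [inter_item_gaps(t) for t in click_times_by_participant]
)
kept = all_gaps[all_gaps <= all_gaps.mean() + all_gaps.std()]
print(kept.mean(), np.median(kept), np.percentile(kept, 10))
```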
And if we focus on the fastest 50% of responses
The three of us on the team have also tried to answer the questions as fast as possible while still reading each question and thinking briefly about the answer. Our fastest attempt (out of 9) took 27 seconds for items 1-9, with a median time between responses of 2.25 seconds. But that attempt basically didn't involve much thinking at all.
Clearly a lot of participants are significantly faster than we are. But how fast is too fast? The sub-2-second median response times do make me worried about the quality of our data.
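For concreteness, the kind of screen we are debating would look something like this (a sketch only; the 2-second floor and the helper name `flag_speeders` are placeholders, not a validated rule):

```python
import statistics

def flag_speeders(gaps_by_participant, floor_sec=2.0):
    """Return participant IDs whose median inter-item time is below floor_sec.

    gaps_by_participant: dict of participant ID -> list of inter-item gaps (s).
    The 2.0 s default is a placeholder threshold, not a validated cutoff.
    """
    return [pid for pid, gaps in gaps_by_participant.items()
            if statistics.median(gaps) < floor_sec]

# Example: p2 would be flagged under this (arbitrary) floor.
print(flag_speeders({"p1": [3.8, 2.5, 4.1], "p2": [1.2, 1.8, 1.5]}))
```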
What do you think?