Why clicks and movements matter in digital survey responses

Computer mouse. Credit: Pixabay/CC0 Public Domain

A new study reveals the subtle effects of survey interfaces on people's responses—and how those small differences can add up.

According to the new research, published in the Journal of Consumer Research, the design of digital surveys influences how respondents engage with them and the answers they provide. Different formats can lead to measurable variations in responses, highlighting the importance of considering these design elements. The findings underscore the need for marketers and researchers to be intentional about survey design.

From rating a rideshare driver to giving feedback after a doctor's appointment, surveys have seeped into nearly every corner of daily life. Cheap digital tools that push out questionnaires with the click of a button make feedback easier than ever to collect.

Whether they're being used by marketers seeking insight into customer preferences or academics collecting data, digital surveys succeed only if people complete them. So to keep survey takers engaged, designers fine-tune formats and try to make the experience as seamless on mobile as it is on desktop. Yet they often overlook a crucial aspect of how people interact with digital surveys: the motions people make when they select answers.

These kinesthetic properties aren't trivial, argue Melanie Brucks, Ph.D. '19, an assistant professor of marketing at Columbia University, and Jonathan Levav, a marketing professor at Stanford Graduate School of Business. These movements can change what a person pays attention to when taking a survey. And that, in turn, has the potential to alter their responses.

As the use of digital surveys surged, Brucks and Levav began to consider how physical actions might influence the psychological processes behind people's responses. "There are so many new physical ways people can respond to surveys," Brucks says—including sliders, radio-button scales, and drag-and-drop.

"People seemed to be assuming that if you change the user interface, maybe it makes the survey more engaging or interactive, but it wouldn't actually change anything about the data you're collecting." They're "taking the as just a given," adds Levav.

Based on decades of research showing a close link between physical movement and how people think, the researchers suspected that survey takers' movements would matter. In the first study of its kind, they compared the physical movements required to complete two common survey formats and found that those movements can steer results.

"People don't really think about different survey format decisions as being consequential," Brucks says. "It's counterintuitive. But we're finding that these different interfaces are more consequential than you might expect."

The takeaway for marketers, designers, and researchers who create surveys is to think carefully about how interfaces may influence responses. "Bringing this link to people's attention is really important," Brucks says. "It's a call to action to be more intentional about survey design. How are people thinking about the questions? How are they responding? It's not just about what will keep them on the page."

To show that movement matters, the researchers conducted a series of 10 experiments involving nearly 23,000 participants. They partnered with research engineer David Perlman in the Stanford GSB Behavioral Lab to develop custom coding tools to track how people clicked and dragged, and how long they held down their mouse button before releasing it.
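The paper's tracking code isn't reproduced here, but the general approach is easy to sketch. Below is a minimal, hypothetical TypeScript example of how a browser-based survey might log clicks, drags, and how long the mouse button is held; the `TrackedEvent` shape and `trackScale` function are illustrative assumptions, not the Stanford lab's actual implementation.

```typescript
// Hypothetical sketch of browser-side interaction logging, loosely modeled
// on the tracking described in the article. Names and structure are
// illustrative assumptions, not the Stanford GSB Behavioral Lab's code.

interface TrackedEvent {
  kind: "down" | "move" | "up";
  x: number; // cursor position within the scale element (px)
  y: number;
  t: number; // ms since the question was shown
}

function trackScale(
  scale: HTMLElement,
  onRelease: (log: TrackedEvent[]) => void
): void {
  const log: TrackedEvent[] = [];
  const t0 = performance.now();

  const record = (kind: TrackedEvent["kind"], e: MouseEvent) => {
    log.push({ kind, x: e.offsetX, y: e.offsetY, t: performance.now() - t0 });
  };

  scale.addEventListener("mousedown", (e) => record("down", e));
  scale.addEventListener("mousemove", (e) => {
    // Only log drags, i.e. movement while the left button is held.
    if (e.buttons === 1) record("move", e);
  });
  scale.addEventListener("mouseup", (e) => {
    record("up", e);
    // Hold duration = time between the last "down" and this "up".
    onRelease(log);
  });
}
```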

In the first study, people answered survey questions on a laptop or desktop computer. Half responded by clicking on a radio-button scale, and the other half dragged a slider along a scale.

At the start of each question, all participants had to deselect a preselected "start" button, ensuring that everyone began at the same point on the scale. They then responded to questions covering topics such as their attitudes toward thunderstorms and road trips, whether they consider themselves natural leaders, and whether they believe in a god.

Sliders vs. radio buttons

Across all types of questions—personality ratings, numerical estimates, moral judgments—people who used the slider tended to choose answers closer to the scale's starting point. For example, in response to "I am bossy" on a scale from 1 (does not apply to me) to 8 (fully applies to me), slider users answered 3.77 on average while radio-button users answered 4.02.

Follow-up studies confirmed these findings and ruled out alternative explanations. Brucks and Levav discovered that when a sliding scale was reversed so participants began on the right side, they tended to pick answers closer to the right.

The researchers also tracked response times: people using sliders began interacting with the scale sooner than those using radio buttons, yet took about the same total time to finalize their responses, dragging the slider over the options as they decided.

Brucks and Levav concluded that the physical act of dragging a slider engages survey takers as they consider their responses. "When people see a slider scale, they read the question, and then they immediately start using the scale," Brucks says. "They use it as their thinking mechanism; they're forming their judgment as they drag." Dragging a slider over an option boosts the chance it will be chosen.

"You need them to notice it and think about it in order for them to select it," Brucks adds. "There's a ton of research now showing that simply by bringing people's attention more to one thing, they're more likely to choose it."

While the study's effect size is small, "little differences can make a big impact," Brucks notes. For example, a 0.2-point difference in a rideshare review could bump a driver's satisfaction rating to the next level or lead to the driver being removed from the platform. And spread across thousands of responses, these small differences add up.

One interface isn't necessarily better than another, Brucks adds. For example, sliders could help correct the common problem of review inflation: consumers tend to give five stars by default, making reviews less reliable. "When there's already a bias toward one response, systematically directing people's attention toward a different option, having them think about it, could be a good thing," she says.

During the study, the researchers also stumbled upon an unexpected way of responding to a slider survey they called "whipping," where survey takers quickly swiped the slider to one end of the scale. "It was a total surprise," Brucks says. "We found that whippers are people who know right away they want to be at the end of the scale, so there's no point in dragging past each option."
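To make the pattern concrete, here is a hypothetical heuristic, in the same sketch style as the tracking code above, for flagging a "whip" in logged drag data: a single fast sweep that lands at a scale endpoint. The velocity and endpoint thresholds are invented for illustration and are not the criteria used in the published study.

```typescript
// Hypothetical heuristic for flagging "whipping" in a drag log produced by
// trackScale() above. Thresholds are illustrative guesses, not the
// operationalization used in the published paper.

function isWhip(log: TrackedEvent[], scaleWidth: number): boolean {
  const moves = log.filter((e) => e.kind === "move");
  if (moves.length < 2) return false;

  const first = moves[0];
  const last = moves[moves.length - 1];
  const distance = Math.abs(last.x - first.x); // px traveled
  const elapsed = (last.t - first.t) / 1000;   // seconds
  const velocity = elapsed > 0 ? distance / elapsed : 0;

  // Did the drag land within 5% of either end of the scale?
  const endedAtEndpoint =
    last.x < scaleWidth * 0.05 || last.x > scaleWidth * 0.95;

  // A "whip": one fast, near-continuous sweep that lands on an endpoint.
  return endedAtEndpoint && velocity > 1500; // px/s, assumed threshold
}
```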

The discovery of whipping goes to show that "these interactions only exist because technology introduced them, and they have psychological significance," Levav says. "It's a dimension of consumer behavior people generally haven't really thought about. And as the interactions between humans and machines only increase, we have to understand how people interact with devices to express their thoughts, feelings, decisions, and judgments."

More information: Melanie S. Brucks et al., "Kinesthetic Properties of Response Scales Yield Different Judgments," Journal of Consumer Research (2025).

Journal information: Journal of Consumer Research

Provided by Stanford University

