Survey Design Exploration

The idea

The question posed was: do enough of our users run our software on devices that are 10+ years old to justify the development effort it takes to maintain the software on those devices?

I decided to run a survey to get the answer; however, I recognized that there were several elements which would impact our response rate. To address this, I ran experiments with a small subset of users to understand how elements like the subject line, email copy, and incentives would affect our response rates before launching the final survey.

The Goal

The goal of these small experiments was to gather data that was significant in practice and could guide my design decisions for the final survey.

I ultimately wanted a high response rate on the final survey which I could feel confident in, while avoiding outreach to an excessively large sample that might waste resources.

While not guaranteed, I take the view that low response rates carry more potential for non-response error: the characteristics of those who respond differ significantly from those who do not, skewing the data.

A Note About Our Users

Our primary demographic is full-time RVers, typically retired, who travel in large rigs. They tend to be sensitive to costs but rich in time. They are typically enthusiastic about RVing and the tools they use, and eager to provide feedback, which is a real blessing.

There exists research on the effectiveness of certain survey elements, but these studies typically focus on younger working professionals, which was part of why I wanted to conduct my own research.

The Experiment and Results

I sent out 4 batches of emails, 12 users per batch: 48 emails total.

SUBJECT LINE

I wanted to see if a time callout in the subject line would affect the open rate. Since the survey was short, an estimated 5 minutes, I hypothesized that calling this out would be a draw for users.

“RV LIFE has a few questions for you - just 5 minutes!”

29 users opened the initial and reminder email

“RV LIFE has a few questions for you!”

30 users opened the initial and reminder email

Ultimately, the open rates were nearly the same, but I went with Version 1 because I felt that the additional context might be beneficial for users.

EMAIL COPY

I knew that our users varied in their tech-savviness, and I theorized that some might find the word 'equipment' intimidating or unclear, so I wanted to see if 'devices' worked better.

“We are looking to understand the types of equipment...”

29 users opened the initial and reminder email

9 users completed the survey

31% completion rate

“We are looking to understand the types of devices...”

30 users opened the initial and reminder email

11 users completed the survey

37% completion rate

I went with Version 2 since it had a higher completion rate.
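
The completion rates above are simply completions divided by opens. A quick sketch of the arithmetic (the variant labels are my own, not part of the original experiment):

```python
# Completion rates for the email-copy experiment, computed from the
# counts reported above (opens of initial + reminder email per variant).
variants = {
    "equipment": {"opened": 29, "completed": 9},
    "devices": {"opened": 30, "completed": 11},
}

for name, counts in variants.items():
    rate = counts["completed"] / counts["opened"]
    print(f"{name}: {rate:.0%}")  # 31% and 37%, matching the figures above
```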

INCENTIVE

There exists research on the positive effect of monetary incentives on survey response rates, but the populations studied differ from our user base.

No Incentive

33 users opened the initial and reminder email

11 users completed the survey

33% completion rate

$5 Gift Card to Vendor of Choice

28 users opened the initial and reminder email

10 users completed the survey

36% completion rate

The incentive-driven version performed better, and I typically advocate for offering an incentive as a token of appreciation. Here, though, I decided against one because of the high cost relative to the significance of the business decision. Given this survey's length and ease, and the response rates during experimentation, I also felt we wouldn't have a hard time recruiting participants compared to some heftier research projects we had upcoming.

Interestingly, only 7 of 10 users claimed their incentive, which suggests that the monetary reward was not the primary motivator.

More Takeaways

Reminder Emails Work

It surprised me how effective reminder emails were: 27 people opened the email and 8 completed the survey. I was initially hesitant because I didn't want to pester our users, but I found that one light reminder really did help our response rate.

The Practice of Sending a Pilot Survey

This approach was inspired by Caroline Jarrett's incredibly helpful book, Surveys That Work. It minimized the Total Survey Error and gave me peace of mind as I launched the final survey. This iterative approach is something I'll continue to use.

A Final Word

I recognize that the nature of this experiment is imperfect: while trying to isolate a single element, I still had to include elements that were not controlled. Yet my goal had been to understand whether any of these elements would affect the survey results in a meaningful way, and I did get that answer. None of the elements I tested had a significant impact, at most a difference of 6 percentage points, which further reinforced that we have a unique user base who seem to enjoy providing feedback and engaging in user research. This was a big insight on its own.
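
One way to back up the "no significant impact" reading: a two-proportion z-test on the email-copy counts reported above (9/29 vs. 11/30 completions). This check was not part of the original analysis; it is just an illustration that a gap of roughly 6 percentage points at this sample size is well within the range of chance.

```python
import math

# Illustrative two-proportion z-test on the email-copy experiment.
# A hand-rolled version using only the standard library's math module.
def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)        # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(9, 29, 11, 30)
print(f"z = {z:.2f}, p = {p:.2f}")  # far from the conventional 0.05 threshold
```

With roughly 30 users per variant, a 6-point difference in completion rate is nowhere near statistically significant, which is consistent with the practical read above.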