Context
The Goal
Research
Users
Our primary demographic is full-time RVers, typically retired, who travel in large rigs. They tend to be cost-sensitive but rich in time. They are typically enthusiastic about RVing and the tools they use, and eager to provide feedback, which is a real blessing.
Research exists on the effectiveness of certain survey elements, but these studies typically focus on younger working professionals, which is partly why I wanted to conduct my own research.
The Experiment and Results
I sent out 4 batches of emails, 12 users per batch: 48 emails total
Subject Line
I wanted to see if a time callout in the subject line would affect the open rate. Since the survey was short, an estimated 5 minutes, I hypothesized that calling this out would be a draw for users.
version 1
“RV LIFE has a few questions for you - just 5 minutes!”
29 opened the initial and reminder email
version 2
30 opened the initial and reminder email
Ultimately, the open rates were nearly identical, but I went with Version 1 because I felt the additional context might benefit users.
Email Copy
I knew that our users varied in their tech-savviness, and I theorized that some might find the word 'equipment' intimidating or unclear, so I wanted to see if 'devices' worked better.
version 1
“We are looking to understand the types of equipment...”
29 opened the initial and reminder email
9 completed the survey
31% completion rate
version 2
“We are looking to understand the types of devices...”
30 opened the initial and reminder email
11 completed the survey
37% completion rate
I went with Version 2 since it had a higher completion rate.
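The completion rates above are simply completions divided by opens, rounded to a whole percentage. As a sanity check, a minimal sketch in plain Python using the counts from this test:

```python
def completion_rate(completed, opened):
    """Completion rate as a whole-number percentage of opens."""
    return round(completed / opened * 100)

# Counts from the email-copy test above.
print(completion_rate(9, 29))   # version 1 ("equipment") -> 31
print(completion_rate(11, 30))  # version 2 ("devices")   -> 37
```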
Incentive
Research exists on the positive effect of monetary incentives on survey response rates, but the populations studied differ from our user base.
version 1
No incentive
33 users opened the initial and reminder email
11 users completed the survey
33% completion rate
version 2
$5 Gift Card to Vendor of Choice
28 users opened the initial and reminder email
10 users completed the survey
36% completion rate
The incentive-driven version performed better, and I typically advocate for offering an incentive as a token of appreciation. Here, though, I decided not to offer one because of the high cost relative to the significance of the business decision. Given this survey's length and ease, and the response rates we saw during this experimentation, I also felt we wouldn't have a hard time recruiting participants compared to some heftier research projects we had coming up.
Interestingly, only 7 of the 10 users claimed their incentive, which suggests that the monetary reward was not the primary motivator.
Wrap Up
takeaways
Reminder Emails Work
It surprised me how effective reminder emails were: 27 people opened the reminder and 8 went on to complete the survey. I was initially hesitant because I didn't want to pester our users, but one light reminder really did help our response rate.
The Practice of Sending a Pilot Survey
This approach was inspired by Caroline Jarrett's incredibly helpful book, Surveys That Work. Piloting helped minimize Total Survey Error and gave me peace of mind as I launched the final survey. It's an iterative practice I'll continue.
a final word
I recognize that this survey's design is imperfect: while trying to isolate a single element, I still had to include elements that were not controlled. Yet my goal had been to understand whether any of these elements would affect the survey results in a meaningful way, and I did get that answer. None of the elements I tested had a significant impact: at most there was a six-percentage-point difference. This reinforced that we have a unique user base who seem to enjoy providing feedback and engaging in user research, which was a big insight on its own.
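For readers who want to put a rough number on "no significant impact," a back-of-the-envelope two-proportion z-test on the completion counts above shows that, at these sample sizes, even the largest observed gap sits well inside the noise (|z| far below the 1.96 threshold for 95% confidence). This is a sketch in plain Python, not part of the original analysis:

```python
import math

def two_prop_z(success1, n1, success2, n2):
    """Two-proportion z-statistic using the pooled standard error."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Email copy test: 9/29 (31%) vs. 11/30 (37%)
print(round(two_prop_z(9, 29, 11, 30), 2))   # 0.46, far below 1.96

# Incentive test: 11/33 (33%) vs. 10/28 (36%)
print(round(two_prop_z(11, 33, 10, 28), 2))  # 0.2
```

With a dozen or so completions per arm, differences this small are exactly what we'd expect from chance alone, which supports the conclusion above.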