Survey design, participant satisfaction and panel member retention. CASRO Panel Conference, February 2010 New Orleans, Louisiana

1 Survey design, participant satisfaction and panel member retention CASRO Panel Conference, February 2010 New Orleans, Louisiana

2 Agenda Introduction Objectives Methodology Survey satisfaction Survey review Future participation Comparative research Conclusions Recommendations

3 Introduction Industry challenges: falling response rates; panel attrition; professional/untrustworthy respondents; maintaining data quality and representativeness. A sustainable, quality panel model requires engaged members. The panel experience has been widely debated, but we argue that the survey experience is fundamental. What impact does survey design have on participant satisfaction, panel member retention and data quality?

4 Objectives Which design features cause dissatisfaction and satisfaction? What is the impact on future participation? Test design features in a good vs. bad survey What is the relationship between survey design, participant behavior and data quality? Produce recommendations for better survey design

5 Methodology Survey satisfaction: analysis of survey satisfaction data from Research Now's Valued Opinions panels (UK, USA, Australia, Germany, Spain, Sweden, Russia, Brazil, China & India); survey classification based on respondent ratings of best/worst surveys (UK, US, Australia & Germany); survey review of satisfaction/dissatisfaction. Future participation: analysis of subsequent participation of members that completed or dropped out of the best and worst surveys. Comparative research: a good vs. bad survey on the UK panel to test design features and analyze the relationship between survey design, participant behavior and data quality.

6 Survey satisfaction data Scores are high, but members differentiate between good and bad surveys. Survey length and reward do not determine survey satisfaction alone. "The survey was user friendly." [Charts: average satisfaction scores by median duration; average satisfaction scores by reward per minute (GB pence)]

7 Survey classification Based on panel member ratings of survey satisfaction: Best 10% (highest level of agreement with survey satisfaction), Midrange, Worst 10% (highest level of disagreement with survey satisfaction).
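The deck does not spell out how the classification was done; as one way to picture it, the sketch below buckets surveys into a best 10% and worst 10% by mean member agreement. The data structure, the field names and the use of the mean as the ranking score are all assumptions for illustration, not Research Now's actual procedure.

```python
from statistics import mean

# Hypothetical per-response satisfaction ratings keyed by survey
# (e.g. 5 = strongly agree "I was satisfied", 1 = strongly disagree).
ratings_by_survey = {
    "survey_001": [5, 4, 4, 5, 3],
    "survey_002": [2, 1, 3, 2, 2],
    "survey_003": [4, 4, 5, 4, 4],
    "survey_004": [3, 3, 4, 2, 3],
}

def classify_surveys(ratings, decile=0.10):
    """Rank surveys by mean member agreement and tag the top and bottom decile."""
    scored = sorted(ratings.items(), key=lambda kv: mean(kv[1]))
    cutoff = max(1, int(len(scored) * decile))
    worst = {sid for sid, _ in scored[:cutoff]}    # highest disagreement
    best = {sid for sid, _ in scored[-cutoff:]}    # highest agreement
    return {sid: ("best 10%" if sid in best
                  else "worst 10%" if sid in worst
                  else "midrange")
            for sid, _ in scored}

print(classify_surveys(ratings_by_survey))
```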

8 Survey review Worst surveys
Which design features cause dissatisfaction and satisfaction?
Panel member comments:
"I was forced to click an answer when I did not agree with any of the options presented"
"The little choose click and drag was cute, but having a million of them with no clue when it would end was seriously not cool"
"During the part where we had to list three ways the company could be sustainable, there should have been at least an 'Other' option"
"The survey was too long"
"Repetitive and tedious"
"Unbelievably tiresome... asking the same questions over and over. I finally just started putting anything down without even reading what to click"
"It was unclear whether 1 or 3 would be considered 'most important'"
"Not relevant"
"Unable to give honest opinion"
"Lots of questions about my [confectionary] habits but... if I was a [consumer of confectionary], which I am not"
"Do people really feel so strongly about... overpriced... WATER?"
"Far too many buttons to click"
"There is essentially no space to write what I thought"
"Too in-depth questions"
"Arduous"
"The survey probably didn't really get the attention it deserved"

9 Survey review Best surveys
Clear, varied, relevant, user-friendly, engaging.
Panel member comments:
"I am very excited to be a part of this type survey, it will really let me voice my opinions on something I have an interest in, and that makes me feel involved. Thank you for allowing me to be a part of this community"
"I like how straightforward and easy the survey was to answer"
"They were very clear and easy to use"
"This survey had more variety of questions than many... that actually pertained to something that was very relevant to me"
"I liked the button interfaces. Was more enjoyable than other surveys because of the drag and drop way of answering questions, as it made the task more fun"
"No complicated options... I loved the 'I don't Know' and 'None'... as I don't drink Whiskey very often, so that button was pushed a lot"
"The instructions were clear and easy to read"
"I really enjoyed the designer [accessories]..."
"The survey got me thinking more about what I would do for my children's future"

10 Future participation analysis What is the impact on future participation?
Best surveys / Worst surveys:
Likelihood of inactivity after completing a survey: 0.7% / 1.5%
Likelihood of inactivity after abandoning a survey: 5.4% / 6.0%
% completing survey (excludes screen outs & quota fails): 90% / 68%
% abandoning survey: 10% / 32%
A panel member is three times as likely to abandon a bad survey as a good survey. A panel member who completes a bad survey is twice as likely to stop being an active member of the panel as one who completes a good survey. The likelihood of inactivity for new members in their first month is higher: 4.5% for best surveys and 6% for worst surveys.
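As a back-of-envelope reading of these figures, the per-survey risk of a member going inactive can be approximated by weighting the two conditional inactivity rates by the completion and abandonment shares; a minimal sketch follows, using only the percentages quoted above. The deck itself does not present this calculation.

```python
def per_survey_inactivity(p_complete, p_inactive_if_completed, p_inactive_if_abandoned):
    """Blend the conditional inactivity rates by how often members complete vs. abandon."""
    p_abandon = 1.0 - p_complete
    return (p_complete * p_inactive_if_completed
            + p_abandon * p_inactive_if_abandoned)

good = per_survey_inactivity(0.90, 0.007, 0.054)  # best surveys
bad = per_survey_inactivity(0.68, 0.015, 0.060)   # worst surveys
print(f"best surveys: {good:.2%} per survey, worst surveys: {bad:.2%} per survey")
```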

11 Rate of decline of panels receiving good & bad surveys [Chart: active panellists vs. number of surveys; Panel A (good surveys) vs. Panel B (bad surveys)] After 30 surveys, panel B will be 60% the size of panel A.
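Compounding those blended per-survey rates suggests how the two panels might diverge. The sketch below assumes each survey invitation carries the same independent inactivity risk, which is a simplification rather than the authors' projection method; under that assumption the ratio after 30 surveys comes out near the 60% quoted on the slide.

```python
# Blended per-survey inactivity rates from the slide 10 figures
# (0.90*0.007 + 0.10*0.054 for good surveys, 0.68*0.015 + 0.32*0.060 for bad).
GOOD_RATE = 0.0117
BAD_RATE = 0.0294

def surviving_share(rate, n_surveys):
    """Share of the original panel still active after n surveys."""
    return (1.0 - rate) ** n_surveys

panel_a = surviving_share(GOOD_RATE, 30)  # panel receiving good surveys
panel_b = surviving_share(BAD_RATE, 30)   # panel receiving bad surveys
print(f"Panel A: {panel_a:.1%}, Panel B: {panel_b:.1%}, "
      f"B as a share of A: {panel_b / panel_a:.0%}")
```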

12 Comparative research Test design features in a good vs. bad survey. 38 questions: Internet activity, electronics, TV brands, corporate responsibility, current affairs. Bad survey: repetitive loops; long lists; large grids; exclude don't know, no opinion & none of these; no progress bar; no flash, rich media or images. Good survey: include don't know, no opinion & none of these; open text boxes added; standard flash, rich media and images added; question wording simplified; option to skip complex question; progress bar added; intro/info screens added to direct respondent. 1000 interviews, 500 per survey, online representative quotas, 20 minutes, 100% incidence, £1 reward. Research Now's UK panel, December 2009.

13 Design features Survey satisfaction was higher for the good survey than the bad survey. "There were several questions where I was asked to make positive statements about brands I had not heard of and so could not answer honestly, but was not given the opportunity to explain. Some parts of the survey were very repetitive and it was difficult to maintain interest." "Really enjoyed this survey. Liked the variety of questions and the variety of ways in which you had to answer them. Excellent!" 70% more comments were received for the bad survey than the good survey. Dissatisfaction was caused by restricted answer options, the inability to give an honest response, and repetition.

14 Behavior What is the relationship between survey design and participant behavior? Those that completed the surveys were similarly conscientious: no significant differences in behavior (speeders/cheats/verbatim). "You had to tick all the boxes and for the makes I have not heard of there was no option to state this, so I had to randomly pick a box even when I didn't agree with it" "There was no 'Other' option and I would have had to lie to tick one on several questions" 1 in 12 dropped out of the bad survey. 40%* more dropped out of the bad survey than the good survey. *5.7% of respondents dropped out of the good survey compared with 8.1% for the bad survey.
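The footnoted dropout rates are enough to reproduce the headline figure; the snippet below is just that arithmetic.

```python
good_dropout = 0.057  # share of respondents dropping out of the good survey
bad_dropout = 0.081   # share of respondents dropping out of the bad survey

relative_increase = bad_dropout / good_dropout - 1.0
# Prints roughly 42%; the slide rounds this to "40% more".
print(f"{relative_increase:.0%} more dropped out of the bad survey")
```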

15 Where did respondents drop out? Of those that abandoned the bad survey, 17% dropped out at this question

16 And the good survey? Changes to the good survey Question re-worded Respondents were not forced to enter 0 Respondents were able to skip the question A text box was added for additional comments Almost 50% fewer dropped out at this question in the good survey

17 Did the data differ? What is the relationship between survey design and data quality?
Bad survey: Make your customers feel they have made the right choice; Help your customers make the final choice of what to buy; Make your customers want to find out more; Inspire your customers to talk to their friends and family about their purchase; Encourage your customers to do some real research into the options available; Talk to your customers shortly after they have made their purchase; Make your customers start thinking about electronics in general.
Good survey: Make your customers feel they have made the right choice; Help your customers make the final choice of what to buy; Encourage your customers to do some real research into the options available; Make your customers want to find out more; Talk to your customers shortly after they have made their purchase; Inspire your customers to talk to their friends and family about their purchase; Make your customers start thinking about electronics in general.
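Because both columns list the same seven statements in different orders, the simplest way to see where the data differed is to compare each statement's position in the two lists. The sketch below does that, treating the printed order as the ordering each survey produced, which is an assumption about how the slide should be read; the labels are abbreviated.

```python
# Labels abbreviated from the statements shown on the slide.
bad_order = [
    "feel they have made the right choice",
    "make the final choice of what to buy",
    "want to find out more",
    "talk to friends and family about their purchase",
    "do some real research into the options available",
    "talk to customers shortly after their purchase",
    "start thinking about electronics in general",
]
good_order = [
    "feel they have made the right choice",
    "make the final choice of what to buy",
    "do some real research into the options available",
    "want to find out more",
    "talk to customers shortly after their purchase",
    "talk to friends and family about their purchase",
    "start thinking about electronics in general",
]

# Report every statement whose position shifted between the two surveys.
for item in bad_order:
    b, g = bad_order.index(item) + 1, good_order.index(item) + 1
    if b != g:
        print(f"{item}: position {b} (bad) -> {g} (good)")
```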

18 Data quality: attitudinal, no "Don't know"
Agreement with "Foreign forces should withdraw immediately from Afghanistan" (bad survey / good survey):
Strongly agree: 23.3% / 22.6%
Agree: 18.8% / 21.8%
Neither agree nor disagree: 35.2% / 24.4%
Disagree: 18.4% / 16%
Strongly disagree: 4.3% / 8.2%
Don't know: not offered / 7.2%
Respondents select "Neither agree nor disagree" where a "Don't know" option is not present: do they mean the same thing?
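One way to explore the slide's closing question is to fold the good survey's "Don't know" share into its "Neither agree nor disagree" share and compare against the bad survey, which offered no "Don't know" option. The sketch below simply restates the table's percentages under that folding; whether the remaining gap is meaningful is left open, as on the slide.

```python
bad = {"Strongly agree": 23.3, "Agree": 18.8, "Neither agree nor disagree": 35.2,
       "Disagree": 18.4, "Strongly disagree": 4.3}
good = {"Strongly agree": 22.6, "Agree": 21.8, "Neither agree nor disagree": 24.4,
        "Disagree": 16.0, "Strongly disagree": 8.2, "Don't know": 7.2}

# Fold the good survey's "Don't know" share into "Neither agree nor disagree"
# so both columns use the same five-point scale as the bad survey.
good_folded = dict(good)
good_folded["Neither agree nor disagree"] += good_folded.pop("Don't know")

for option in bad:
    print(f"{option}: bad {bad[option]:.1f}% vs good (DK folded in) {good_folded[option]:.1f}%")
```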

19 Conclusions Which design features cause dissatisfaction and satisfaction? Dissatisfaction: inability to give an honest opinion; questions or tasks that are unclear; repetitive; too in-depth; arduous; irrelevant. Satisfaction: relevant; clear; user-friendly; varied. Enhanced satisfaction: thought-provoking; interactive; fun; participants feel valued for their contributions.

20 Conclusions What is the impact on future participation? What is the relationship between survey design, participant behavior and data quality? Future participation/loyalty: members are tolerant, but a consistently bad survey experience impacts loyalty; the likelihood of inactivity is highest amongst new members. Behavior: members are generally honest, but surveys do not always allow them to be; respondents are more likely to drop out of a bad survey than a good survey. Data quality: small differences in questionnaire design can cause very different data results; the representativeness of data may be impacted if only the more tolerant and loyal members complete poor surveys.

21 Recommendations Respect and trust survey participants Back to basics for questionnaire design More collaboration between client, researcher and online panel owner for mutual benefit Use new techniques and innovation for continued engagement Nurture new members

22 Any questions?