Capturing the Moment: A Feasibility Test of Geo-Fencing and Mobile App Data Collection

Dan Seldin, PhD, Director, J.D. Power
Gina Pingitore, PhD, Vice President and CRO, J.D. Power
Laurie Alexander, Senior Manager, J.D. Power
Chris St. Hilaire, President and CEO, MFour Mobile Research
Background

Most consumer surveys require respondents to complete evaluations well after their experience.
Some of the questions asked (e.g., time waited in line) are difficult to answer accurately a day, let alone months, after the experience.
Current technology makes it possible to capture respondents in the moment.
But will consumers participate?
Purpose of the Research

Test the feasibility of using geo-fencing and mobile app surveys to capture in-the-moment customer feedback.

Primary focus:
Whether consumers would respond while shopping or immediately after shopping
How they perceived the survey experience

Secondary focus:
Differences between textbox vs. drop-down/scroll response options
Demographics of respondents
Comparison of the results to traditional online research
Methodology

Sample: Collaboration with three sample providers with geo-fencing capabilities (731, 947, and 516 completes, respectively).

Design: Mobile panelists who were geo-located at a grocery store were invited to participate via push notification to their mobile devices. Consumers were assigned to either a mystery shop or a satisfaction survey condition.
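The geo-location trigger described above can be illustrated with a simple radius check around a store. This is a minimal sketch, not any provider's actual implementation; the store coordinates, the 100-meter radius, and the function names are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(device, store, radius_m=100):
    """True if the device's reported position falls within radius_m of the store.

    In a live panel, a True result would trigger the push-notification invite.
    """
    return haversine_m(device[0], device[1], store[0], store[1]) <= radius_m

# Hypothetical grocery store location and two reported device positions.
store = (34.0522, -118.2437)
print(inside_geofence((34.0525, -118.2434), store))  # shopper near the store
print(inside_geofence((34.1000, -118.3000), store))  # shopper far away
```

A production geofence would rely on the platform's geofencing APIs rather than polling raw coordinates, but the underlying test is this distance comparison.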
Methodology

The mystery shop condition was designed to capture feedback that is difficult to capture after the experience. Customers completed 9 questions/tasks as they shopped, such as:
Finding an item in the store
Taking a photo of the store interior, a display, and areas that were not clean
Timing how long it took to check out

The satisfaction survey had customers rate various aspects of their experience (16 total questions/tasks):
How long it took to check out
Professionalism of the staff
Layout and design of the store
Methodology

In each survey condition, consumers were randomly assigned to one of the following response types: textbox, drop-down, or scroll.
Sample item: "Was the checkout clerk's name visible on his/her uniform?"
Respondents validated their location by taking a picture of their receipt or filled shopping cart.
If We Build It, Will They Come?
Completion Rates by Sample Provider

usamp did not capture the number of push notifications at the time of fielding, so its completion rate could not be calculated.
The overall completion rate was 23%, and rates were similar across the two apps/panels: Research Now 24%, MFour 23%.
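The completion rates above are completes divided by push notifications sent. The deck reports only the completes and the resulting rates, so the invite counts below are hypothetical illustrations back-solved to match the reported ~24% and ~23%.

```python
def completion_rate(completes, invites):
    """Completion rate = surveys completed / push-notification invites sent."""
    return completes / invites

# Completes (947, 516) are from the study; the invite counts are hypothetical.
research_now = completion_rate(947, 3946)
mfour = completion_rate(516, 2243)
print(round(research_now, 2), round(mfour, 2))
```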
Completion Rates by Survey Condition

Completion rates were 12 percentage points higher when the survey was completed right after shopping versus while shopping: satisfaction 30% vs. mystery shop 18%.
Survey Completion Times

Mystery shop evaluations took 9 minutes longer to complete than a satisfaction survey.
Satisfaction surveys with textboxes took 1 minute longer to complete than those with drop-down response types.
Incompletion rates were 6 percentage points higher for the textbox vs. drop-down/scroll conditions.
Satisfaction With the Survey Experience

To assess the survey experience, respondents were asked how easy the survey was to complete. The percentage of "Very easy" responses was compared by survey condition and question type after controlling for vendor differences.
Satisfaction With the Survey Experience

Respondents reported the satisfaction survey was easier to complete than the mystery shop assessment: 53% vs. 42% rated it "Very easy" to complete.
No significant differences were found between textbox, scroll, and drop-down question types.
How Comparable Is the Data?
Comparing Results to Traditional Survey Research

SSI panelists (n = 705) completed a full grocery store experience survey online one month after the mobile app surveys were fielded, in order to compare:
Respondent profiles: mobile app completes, mobile app incompletes, traditional online panel completes, and overall panel profiles
Survey results: satisfaction scores, checkout times, and qualitative feedback
Respondent Profiles

Across panels, mobile app responders and nonresponders had similar profiles.
Compared to traditional online panel completes and overall panel profiles, mobile app responders did not differ on gender, education, ethnicity, or rewards program membership.
Mobile app responders were younger than traditional online panel members: mean age 35 vs. 40.
Satisfaction Scores

Overall satisfaction ratings via mobile apps were approximately 40 points lower (on a 100- to 1,000-point scale) than traditional online panel scores: usamp 773, Research Now 783, MFour 722, vs. online 800.
These differences held when demographics were controlled.
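One simple way to "control for demographics" when comparing modes is to compute the mobile-minus-online gap within each demographic stratum and then average across strata, so composition differences (such as the younger mobile sample) cannot drive the result. The sketch below illustrates that idea with hypothetical records; it is not the study's actual model, and the field names and values are invented.

```python
from collections import defaultdict
from statistics import mean

def adjusted_gap(records, strata_key="age_band"):
    """Mobile-minus-online satisfaction gap, averaged within demographic strata.

    A minimal sketch of demographic control: compute the mode gap inside
    each stratum, then average the per-stratum gaps.
    """
    by_stratum = defaultdict(lambda: {"mobile": [], "online": []})
    for r in records:
        by_stratum[r[strata_key]][r["mode"]].append(r["osat"])
    gaps = [mean(g["mobile"]) - mean(g["online"])
            for g in by_stratum.values()
            if g["mobile"] and g["online"]]  # need both modes in a stratum
    return mean(gaps)

# Hypothetical records on the study's 100-1,000 satisfaction index.
sample = [
    {"mode": "mobile", "age_band": "18-34", "osat": 740},
    {"mode": "mobile", "age_band": "18-34", "osat": 760},
    {"mode": "online", "age_band": "18-34", "osat": 790},
    {"mode": "mobile", "age_band": "35-54", "osat": 770},
    {"mode": "online", "age_band": "35-54", "osat": 805},
    {"mode": "online", "age_band": "35-54", "osat": 815},
]
print(adjusted_gap(sample))  # negative: mobile scores lower even within age bands
```

A regression with demographic covariates would accomplish the same adjustment more flexibly; stratified means are just the most transparent version of the idea.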
Satisfaction Scores by Response Type

Textbox scores were approximately 10 index points higher than drop-down/scroll scores.
[Chart: Overall Satisfaction Index by response type (drop-down, scroll, textbox) for usamp, Research Now, and MFour; scores ranged from 718 to 794.]
Checkout Times

Reported checkout times increased as the delay between the experience and the survey increased:
Mobile app mystery shop (during the experience) = 5 minutes
Mobile app satisfaction (upon completion) = 10 minutes
Traditional online panel (after the fact) = 12 minutes

The textbox response type had the most outliers and a higher mean wait time than drop-down or scroll.
Qualitative Feedback

On average, 40% of traditional online panel respondents provide optional qualitative feedback (verbatims).
Qualitative Feedback

Mobile app mystery shop responders took one optional picture or video of something in the store to highlight their experience.
Summary and Next Steps
Summary

Mobile app surveys with geo-fencing are a feasible means of capturing in-the-moment customer feedback:
Overall completion rates were sufficient
Surveys were perceived to be easy to complete
Respondents were willing to provide multimedia feedback

Some notable differences emerged in comparison to traditional online data collection:
Mobile app responders were younger
Overall satisfaction scores were lower
Checkout times were more accurately captured

Drop-down/scroll response types were faster to complete, had fewer drop-outs, and yielded better data than textboxes.
Next Steps

Further test the feasibility of this methodology for capturing other types of high-incidence, as well as low-incidence, events. In additional research, we tested the feasibility of capturing low-incidence events (e.g., new auto shoppers).
In-the-moment mobile apps can potentially be leveraged as an alternative to traditional mystery shops. Future research should assess how many questions respondents will answer and how the data compare to mystery shop evaluations.
Further research is needed to optimize the survey experience: survey look and feel, and response-option formatting.
Questions and Answers