What can we learn from the Understanding Society Innovation Panel?


1 What can we learn from the Understanding Society Innovation Panel?
Annette Jäckle, Institute for Social and Economic Research (ISER), University of Essex
An initiative by the Economic and Social Research Council, with scientific leadership by the Institute for Social and Economic Research, University of Essex, and survey delivery by NatCen Social Research and TNS BMRB

2 What is the Understanding Society Innovation Panel?
- Household panel survey; survey design mirrors the main Understanding Society panel
- Probability sample of 1,500 households in Great Britain, plus refreshment samples
- All adults (16+) interviewed annually since 2008
- Purpose: methods testing, experimentation, qualitative studies
- Content of annual interviews: Understanding Society core questionnaire, core methods testing, open competition
- Between interviews: Associated Studies testing different data collection methods

3 Innovation Panel waves 1-9
- On average 11 experiments per wave; some run across multiple waves
- Between-wave studies: qualitative studies on sub-samples (2), web survey on the full sample (1), app study on the full sample (1)
- Known to respondents as the Understanding Society survey, with the same branding as the main survey

4 (1) Can we learn from the Innovation Panel?
Inference of treatment effects:
- in the main Understanding Society panel
- in other longitudinal surveys
Requirements: external validity, internal validity

5 External validity
Design mirrors the main Understanding Society survey:
- Sample design (IP excludes Northern Ireland and the Scottish Highlands & Islands)
- Time frame (IP started one year earlier)
- Following rules and tracing procedures
- Questionnaire instruments (subject to IP experiments)
- Fieldwork procedures (subject to IP experiments)

6 External validity
Differential errors in the Innovation Panel due to:
- Experiments??
- Attrition??
- Context effects??
- Panel conditioning?
Comparison at wave 6: main survey GB sample (N=23,918) vs Innovation Panel wave 1 sample (N=1,356)

7 Response rates
[Chart: household response rates and individual response rates (conditional on household response) by wave, in percent; main survey (GB) vs Innovation Panel (original sample)]
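The conditional structure of these rates can be made concrete with a small sketch: the individual rate is computed only among adults in responding households, so the overall individual-level rate is the product of the two. All figures below are hypothetical illustrations, not the actual survey numbers.

```python
def response_rates(eligible_hh, responding_hh, eligible_adults, responding_adults):
    """Household response rate, individual response rate conditional on
    household response, and the overall individual-level rate (their product)."""
    hh_rate = responding_hh / eligible_hh
    # The individual rate is conditional: the denominator counts only
    # adults living in households that responded.
    ind_rate = responding_adults / eligible_adults
    return hh_rate, ind_rate, hh_rate * ind_rate

# Hypothetical illustration:
hh, ind, overall = response_rates(2000, 1500, 2700, 2160)
# hh = 0.75, ind = 0.8, overall ≈ 0.6
```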

8 Sample composition: gender
[Chart: percent male and female, main survey vs Innovation Panel]

9 Sample composition: age
[Chart: percent by age group, main survey vs Innovation Panel]

10 Sample composition: country
[Chart: percent in England, Wales and Scotland, main survey vs Innovation Panel]

11 Context effects, panel conditioning
Do respondents process survey questions differently because they are being experimented on?

12 Survey methods experiments
- Question wording: generic issues, topic-specific
- Survey procedures: participant communications, incentives, mode of interview
- Between-wave data collection: smartphone app study
Invisible experiments: context effects for non-experimental questions? Visible experiments: context or conditioning effects?

13 Internal validity
Effects of treatments must not be confounded with other factors that could influence observed outcomes.
- Space between experiments: HH grid, HH questionnaire, youth questionnaire; the individual interview averages 40 mins, of which more than 20 mins are core Understanding Society modules and the rest experimental modules
- Allocation to treatments: fully crossed allocations for experiments affecting the same outcomes; stratified random allocation for all experiments
- Potential contamination? Allocation at the level of PSU, interviewer, household or individual
See Lynn & Jäckle, "Mounting multiple experiments on longitudinal social surveys: Design and implementation considerations" (chapter in the Wiley book edited by Lavrakas et al.)
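Fully crossed allocation means every combination of treatment levels is used, so the effect of one experiment is not confounded with another. A minimal sketch of crossed, balanced random allocation follows; the household IDs and the two experiments (incentive, mode) are hypothetical, and the actual IP procedure additionally stratifies the randomisation, a refinement omitted here.

```python
import itertools
import random

def allocate_crossed(units, factor_levels, seed=0):
    """Randomly assign each unit to one cell of a fully crossed design,
    dealing units round-robin across cells for balanced group sizes."""
    rng = random.Random(seed)
    cells = list(itertools.product(*factor_levels.values()))
    shuffled = units[:]
    rng.shuffle(shuffled)
    allocation = {}
    for i, unit in enumerate(shuffled):
        # Cell i % len(cells): each combination of levels gets an equal share.
        allocation[unit] = dict(zip(factor_levels.keys(), cells[i % len(cells)]))
    return allocation

# Hypothetical example: two crossed experiments, 12 households.
hh_ids = [f"HH{i:02d}" for i in range(12)]
design = {"incentive": ["10", "20"], "mode": ["web-first", "capi-only"]}
alloc = allocate_crossed(hh_ids, design)
```

Because the design is crossed and balanced, each of the four (incentive, mode) cells receives exactly three households, so either experiment can be analysed while averaging over the other.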

14 (2) What can we learn from the Innovation Panel?

15 The future of longitudinal surveys?
[Diagram: the survey linked to] wearables; smartphone apps; barcode scanners and till receipts; financial aggregators; clickstream data; social media; bio samples by post; store card data

16 Mixed mode data collection
Understanding Society = CAPI. Can we use mixed modes to save costs, without increasing attrition and without affecting measurement?
- CATI → CAPI? Experiment in IP2: increased attrition, so no!
- Web → CAPI: experiment in IP5-IP9; 2/3 of HHs: Web → CAPI, 1/3 of HHs: CAPI only

17 Mixed mode data collection
- Will attrition increase if respondents don't interact with interviewers?
- Will we miss new joiners / leavers in the HH grid?
- What % of HHs will complete everything by Web?
- Can we increase the % of Web-only households?
- Will mixed modes affect measurement?
- Can we save money?
(Michaela Benzeval, tomorrow)

18 Mixed modes: attrition
[Chart: household response rates by wave, in percent; CAPI vs Web → CAPI]

19 Mixed modes: HH grid
[Table: mean number of joiners and leavers per HH (waves 5-9), F2F vs Web]

20 Mixed modes: web-only HHs
[Chart: % of Web → CAPI HHs who complete all instruments by Web, by wave; shown for complete & partial HHs and for complete HHs]
A bonus incentive for completing the Web interview within 2 weeks increases the % of Web-only HHs.

21 Mixed modes: measurement
- Analysis of 479 variables in IP5-IP8
- Differences in response distributions between CAPI and Web in 18% of variables tested
- But: mostly selection effects; measurement effects in only 4% of variables
- CAPI includes a CASI module for sensitive questions
- High-risk questions from the Ethnic Minority Boost sample are not in the Innovation Panel
Source: Jäckle et al., "Identifying and predicting the effects of data collection mode on measurement"

22 Biomeasures
Main survey waves 2-3: nurse interview collecting
- Physical measures
- Blood samples (blood analytes, DNA, DNA methylation)
- Medications
- Conditions on the day
Can biomeasures be collected in a mixed mode survey?

23 Biomeasures
Experiments in IP wave 12 (2019):
- Administration method: nurse visit; interviewer leaves kit; kit by post
- Samples collected: full blood; blood spot; hair sample
- Feedback vs no feedback

24 Mobile app
- App to measure monthly household expenditure: scan shopping receipts, enter purchases without receipts, report no purchases that day
- Oct - Jan 2017; use for 1 month
- Own smartphone or tablet, iOS or Android
- With Kantar Worldpanel; funded by ESRC/NCRM

25 Mobile app
- What % of a general population sample will participate?
- How many will drop out during the month?
- What are the main barriers to participation?
- Are participants representative of the population?

26 Mobile app: participation
N=2,114 Innovation Panel respondents
[Table: N and % who completed the registration survey; used the app at least once; used the app at least once in each of five weeks]
Source: Jäckle, Burton, Couper & Lessof, "Participation in a mobile app survey to collect expenditure data as part of a large-scale probability household panel: response rates and response biases"

27 Mobile app: dropout
[Chart: % of participants who used the app and % who continued in the study, by day]
Source: Jäckle, Burton, Couper & Lessof, "Participation in a mobile app survey to collect expenditure data as part of a large-scale probability household panel: response rates and response biases"

28 Mobile app: barriers
Probit model of participation; strongest predictors:
- Indicators of general cooperativeness with the survey
- Using the device every day
- Hypothetical willingness to download the app
14% have a mobile device AND are hypothetically willing to download an app for a survey
Source: Jäckle, Burton, Couper & Lessof, "Participation in a mobile app survey to collect expenditure data as part of a large-scale probability household panel: response rates and response biases"
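A probit model maps a linear index of predictors like these to a participation probability via the standard normal CDF. A minimal sketch of that link function follows; the coefficient values are hypothetical placeholders, not the paper's estimates.

```python
import math

def probit_prob(xb):
    """Participation probability under a probit model: the standard
    normal CDF evaluated at the linear index x'beta."""
    return 0.5 * (1.0 + math.erf(xb / math.sqrt(2.0)))

# Hypothetical coefficients (NOT the paper's estimates):
b0, b_daily_use, b_willing = -1.5, 0.6, 0.9
# Predicted probability for a daily device user who is willing to download:
p = probit_prob(b0 + b_daily_use * 1 + b_willing * 1)
```

A zero linear index gives a probability of 0.5, and the probability rises monotonically with the index, which is why large positive coefficients mark the strongest predictors of participation.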

29 Mobile app: representativeness
- Socio-demographics over-represented: women; younger age groups; higher education
- Financial behaviours over-represented: keeping a budget (esp. using a computer or spreadsheet); regularly checking bank balance; using an app to check bank balance; having store loyalty cards
- Correlates of household expenditure: no biases? Income, subjective financial situation, household spending, late bills
Source: Jäckle, Burton, Couper & Lessof, "Participation in a mobile app survey to collect expenditure data as part of a large-scale probability household panel: response rates and response biases"

30 Thank you.