State Services Commission Kiwis Count Survey


State Services Commission Kiwis Count Survey

Technical Report (July - December 2016)

Prepared for the State Services Commission
Prepared by Gravitas Research and Strategy Ltd

26 February

Table of Contents

1. Introduction and Background
   1.1. Introduction
   1.2. Background
   1.3. Survey Coverage
2. Sample Process, Weighting and Survey Method
   2.1. Assessment Population and Sample Process
   2.2. Ensuring Representativeness and Data Weighting
   2.3. Survey Method
   2.4. Survey Frequency
   2.5. Quarterly Dates
3. Questionnaire and Monthly Materials
   3.1. Questionnaire
   3.2. Monthly Materials Used
4. Data Collection Process
   4.1. The Online Survey
   4.2. The Self-Completion Postal Survey
5. Data Processing - Coding and Cleaning
   5.1. Coding
   5.2. Data Cleaning
6. Response Rate
   6.1. Response Rate Calculation
   6.2. A Note on Response Rates by Demographics
   6.3. Maximising Respondent Comfort to Sustain and Build Response Rates
   6.4. Mode of Response
   6.5. Cut-Off Dates for Inclusion of Data
7. Annual Sample Composition and Weighting
   7.1. Annual Sample Composition and Weighting
8. Data Analysis Explanation
   8.1. Scale Conversion for Comparisons
   8.2. Overall Service Quality Scores Calculation
   8.3. A Note on the 2016 Change to Calendar Year Reporting
   8.4. Suppression Rules
Appendix 1: July to December 2016 Questionnaire
Appendix 2: Service Groupings

The Kiwis Count Survey and this report contain material from Citizens First, used under licence and reproduced with the permission of the Executive Director of the Institute for Citizen-Centred Service. This work is licensed under the Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 New Zealand licence. [In essence, you are free to copy and distribute the work (including in other media and formats) for non-commercial purposes, as long as you attribute the work to the Crown, do not adapt the work and abide by the other licence terms.] To view a copy of this licence, visit creativecommons.org/licenses/by-nc-nd/3.0/nz/. Attribution to the Crown should be in written form and not by reproduction of any emblem, logo or Coat of Arms.

1. Introduction and Background

1.1. Introduction
The purpose of this document is to outline the technical details of the Kiwis Count Survey, including the methodology, sampling, weighting and data analysis for the surveys undertaken by Gravitas between July and December 2016. Since 2012, Kiwis Count has been undertaken as a continuous survey.[1] This allows the approach to the survey to be improved from month to month. This document was developed at the completion of the 2016 reporting year. The State Services Commission (SSC) publishes survey findings in separate reports on a regular basis.

1.2. Background
The Kiwis Count Survey is an all-of-government national survey measuring New Zealanders' experiences of public services. Kiwis Count uses the Canadian Citizens First survey methodology. Canada is among the world leaders in providing citizen-centred public services, and adapting international best practice for use in New Zealand allows useful comparisons to be made between New Zealand's public service and that of an acknowledged world leader.

The Kiwis Count survey was first carried out by the State Services Commission in 2007 and then repeated in 2009. The purpose of the 2009 survey was to measure New Zealanders' experiences of public services, comparing against the 2007 baseline to measure progress and identify opportunities to further improve frontline service delivery. Both the 2007 and 2009 surveys were point-in-time measures. Each was conducted via a postal survey (with an option to complete online). While providing valuable information, the outputs were periodic data releases and written reports that provided static performance measurement.

In 2012, Kiwis Count changed to become a continuous survey. This change allowed the capture of trend information and provides regular updates, so the survey is a business-as-usual management tool rather than an occasional one-off project reviewing performance. Every month, a number of New Zealanders are invited to participate in the survey. Data is provided quarterly to SSC.

[1] From January 2012 to June 2016 the survey was conducted by Nielsen.

1.3. Survey Coverage
The questionnaire used between July and December 2016 was unchanged from the previous surveying wave and is based on the version introduced in February 2015 (which was itself based on the questions from the 2007 and 2009 surveys). Copies of all past questionnaires are available from SSC, and the current questionnaire is reproduced in Appendix 1. The questionnaire was divided into four sections, as follows:

Section A: Experiences of public services (the key service quality section)
Section B: Experiences of non-government services
Section C: Government and the digital environment
Section D: Demographic section

The core of the Kiwis Count survey asks New Zealanders about their experience of using 42 government services in the past 12 months and the quality of these services (please refer to the questionnaire for the list of services included). Respondents are also asked in more detail about their most recent experience, with specific questions asked depending on the type of service experience (telephone, online or face-to-face). Perceptions of public services versus the private sector are also evaluated, with the quality of service received from private sector organisations (such as banks, insurance companies and internet service providers) measured for benchmarking purposes. The survey concludes with an interchangeable module and demographic questions.

The full questionnaire is generally reviewed every six months, to ensure it continues to meet its objectives. Feedback from respondents, received via cognitive testing and via comments made by those who have completed the survey, is also monitored to ensure the questionnaire remains easy to understand and complete.

2. Sample Process, Weighting and Survey Method

2.1. Assessment Population and Sample Process
Over the course of a year a minimum of 2,000 surveys are completed (500 per quarter), with a new sample selected on a monthly basis. The target population for the survey is the New Zealand population aged 18 years and older, based on estimates from Statistics New Zealand. The sample frame is the New Zealand Electoral Roll, which is updated every three months to ensure the sampling frame is as up to date as possible. The Electoral Roll allows a random selection of named respondents, as well as more targeted oversampling of those of Māori ethnicity (identified as being of Māori descent in the Roll) and youth. It also allows more targeted sampling by age group, as it provides age data within five-year bands. The Electoral Roll was also the sample frame used in the previous Kiwis Count surveys.

The list of potential respondents is selected at random from the full Electoral Roll (provided by the State Services team). The random sampling process follows that of previous Kiwis Count surveys, with Māori-descent electors sampled initially (with rates adjusted as required), followed by all non-Māori-descent electors (by area, with the sample drawn in proportion to the population). Within each of these samples, age is stratified, with oversampling of youth in both. Steps are also in place to de-duplicate the randomly selected sample, so that people selected at random and invited to take part in the survey are not re-contacted more than once within a 12-month period.
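As an illustration of this selection step, the sketch below draws a de-duplicated monthly sample with Māori-descent electors sampled first and age-stratified youth oversampling. It is a minimal sketch only: the field names (id, maori_descent, age_band) are hypothetical stand-ins for the actual roll variables, and the geographic stratification of non-Māori electors is omitted for brevity.

```python
import random
from collections import defaultdict

def stratified_sample(electors, targets):
    """Simple random sampling within age bands; giving the youth bands
    larger targets than their population share oversamples 18-24 year olds."""
    by_band = defaultdict(list)
    for elector in electors:
        by_band[elector["age_band"]].append(elector)
    selected = []
    for band, n in targets.items():
        selected += random.sample(by_band[band], min(n, len(by_band[band])))
    return selected

def select_monthly_sample(roll, contacted_last_12m, maori_targets, other_targets):
    """Exclude anyone invited in the last 12 months, then sample
    Maori-descent electors first, followed by all other electors."""
    eligible = [e for e in roll if e["id"] not in contacted_last_12m]
    maori = [e for e in eligible if e["maori_descent"]]
    other = [e for e in eligible if not e["maori_descent"]]
    return (stratified_sample(maori, maori_targets)
            + stratified_sample(other, other_targets))
```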

2.2. Ensuring Representativeness and Data Weighting
It is very important that the survey data is representative and that, as far as possible, the distribution of responses matches the distribution of the New Zealand population aged 18 years and over. There are two key ways to address this:

1. Undertake a quota sample, sending out booster invitations as applicable, whereby targets for completion are set around key sample characteristics (such as Māori and youth); and
2. Post-weight the data to ensure that the statistics produced more accurately represent the New Zealand population, rather than the sampled population.

We undertake an approach that includes both of these elements. Firstly, for some key parameters in each demographic/location group, quotas are set for sampling and, as applicable, booster samples of key groups are selected and sent invitation letters, to ensure that the data collected in the survey does not significantly under-sample any key group. Once we have taken steps to maximise representativeness in the natural sample and the data has been collected, the results are weighted to reflect the distribution of the New Zealand population according to key characteristics. This is based on the weighting methodology used in previous Kiwis Count surveys, using the most up-to-date information from Statistics New Zealand. Refer to Section 7 for further detail on sample weighting.

2.3. Survey Method
Given the current scope and purpose of the Kiwis Count Survey, the method currently used is seen as the most suitable at this point in time, and is also the method previously used for this survey. The survey is conducted using a postal invitation letter to an online survey and/or a self-completion postal survey. In the initial invitation letter, respondents are asked to complete the survey online; then, following a reminder postcard, those who have not yet completed the survey are mailed a questionnaire and asked to complete it themselves before posting the form back.

2.4. Survey Frequency

Figure 1 provides an overview of the survey process that is undertaken each month (please see Section 3.2 for examples of each of the materials used).

Figure 1: Monthly Survey Process

Week 1 (1st Wednesday): Invitation letter - outlines the purpose of the survey, an 0800 number for any questions, the response dates, and a link to the online survey.
Week 2 (2nd Wednesday, one week later): Reminder postcard - the initial reminder postcard, for the online survey.
Week 3 (3rd Wednesday, one week later): Survey pack - those who have not completed the online survey are sent a survey pack that includes a second cover letter (which also references the link to the online survey as an option), a pen, a self-completion postal survey and a freepost return envelope.
Week 4 (4th Wednesday, one week later): Reminder postcard - the final reminder postcard, covering both the postal survey and the online survey.

The survey is run as a continuous monitor, with a new sample selected and sent invitations to complete the survey each month. (Note: January fieldwork is conducted in February as a double sample, and December invitations are sent out a week after the November invitations.) Each week, the survey material that is due to be sent (invitation letter, reminder postcard or survey pack) is sent out on a Wednesday, to allow respondents time to receive and respond to the materials sent the previous week. Note: given the change in the timing of NZ Post mail deliveries, the day of the week on which each item is sent out has been changed from previous surveys.

Based on the response rates achieved in previous Kiwis Count surveys, n=442 survey invitation letters are sent out each month to achieve the annual target of at least n=2,000 completed surveys per year.
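The Wednesday cadence above can be computed with the standard library alone. A minimal sketch (the dictionary keys are illustrative labels, not fields from any actual dispatch system):

```python
import datetime

def nth_weekday(year, month, weekday, n):
    """Date of the nth given weekday in a month (Monday=0 ... Sunday=6)."""
    first = datetime.date(year, month, 1)
    offset = (weekday - first.weekday()) % 7
    return first + datetime.timedelta(days=offset + 7 * (n - 1))

def monthly_mailout_schedule(year, month):
    """Invitation on the first Wednesday, then mail-outs at weekly intervals."""
    invite = nth_weekday(year, month, 2, 1)  # Wednesday = 2
    week = datetime.timedelta(days=7)
    return {
        "invitation_letter": invite,
        "online_reminder_postcard": invite + week,
        "survey_pack": invite + 2 * week,
        "final_reminder_postcard": invite + 3 * week,
    }

# For example, July 2016: invitation letter on Wednesday 6 July 2016,
# with follow-ups on 13, 20 and 27 July.
schedule = monthly_mailout_schedule(2016, 7)
```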

2.5. Quarterly Dates
The following table shows the surveying months included in each quarter of the July to December 2016 fieldwork period that Gravitas undertook, along with the cut-off dates for returns to be included in each quarterly dataset and for the calculation of response rates. These dates follow those of the previous Kiwis Count surveys.

Table 1: Months and Cut-Off Dates by Quarter

  Month questionnaire is sent out               Quarter    Cut-off date (for quarterly data and response rate)
  July 2016, August 2016, September 2016        Q3 2016    3rd Sunday in October 2016
  October 2016, November 2016, December 2016*   Q4 2016    3rd Sunday in January 2017

*December invitations are sent out a week after the November invitations.
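The third-Sunday cut-offs in Table 1 follow a fixed rule, sketched below for illustration (the function names are ours):

```python
import datetime

def third_sunday(year, month):
    """Date of the third Sunday of a month."""
    first = datetime.date(year, month, 1)
    days_to_sunday = (6 - first.weekday()) % 7  # Monday=0 ... Sunday=6
    return first + datetime.timedelta(days=days_to_sunday + 14)

def quarter_cutoff(year, quarter):
    """Cut-off for a quarter's returns: the third Sunday of the month after
    the quarter ends (e.g. Q3 2016 -> third Sunday in October 2016); the Q4
    cut-off falls in January of the following year."""
    month_after = 3 * quarter + 1
    if month_after > 12:
        return third_sunday(year + 1, 1)
    return third_sunday(year, month_after)

# quarter_cutoff(2016, 3) -> 2016-10-16; quarter_cutoff(2016, 4) -> 2017-01-15
```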

3. Questionnaire and Monthly Materials

3.1. Questionnaire

July to December 2016 Surveying Period
The questionnaire used in the six months of surveying between July and December 2016 was unchanged from the questionnaire used in January to June 2016. Please refer to Appendix 1 for the questionnaire used.

Future Survey Changes
While no major changes were made during this six months of surveying, during this time Gravitas worked with the State Services Commission to review the current survey, internal and external reporting needs, and future strategic directions, in order to develop a new questionnaire. This review resulted in a number of changes and enhancements to the questionnaire, which have been cognitively tested. Note: the details of the questionnaire changes and the results of the cognitive testing will be included in the next technical report (covering the period when the new questionnaire is in use).

3.2. Monthly Materials Used
The monthly materials (invitation letter, postcard reminders, etc.) used in the six months of surveying between July and December 2016 were unchanged from those used previously, with the exception of a change in contact details (from Nielsen to Gravitas) and some minor tweaking of the FAQs.

Initial contact - invitation letter
An invitation letter, which contains an overview of the survey and a link to the online survey with a unique login ID, is mailed to everyone selected from the Electoral Roll to take part in the survey. The letter also includes a list of frequently asked questions about the survey with their corresponding answers, and directs respondents to a 0508 (toll-free) number and an email address if they have any questions about the survey. Those without an internet connection, or who prefer to complete the survey on paper, can contact Gravitas via the same methods to request a paper copy of the survey.

The invitation letter is sent on the first Wednesday of every month to the full sample of potential respondents.

Second contact - first reminder postcard
The invitation letter is followed up with a reminder postcard, also carrying the survey link and the unique login ID. The reminder postcard is sent a week after the initial invitation letter (i.e. the following Wednesday) to any respondents in the sample who have not yet completed the online survey or contacted Gravitas.

Third contact - survey pack
Approximately one week later, those who have not yet completed the survey online are sent a survey pack with a cover letter, a hard-copy questionnaire, a reply-paid envelope and a Kiwis Count pen. The survey link, unique login ID and password are repeated in the letter should the respondent prefer to complete the survey online.

Final contact - second reminder postcard
Those who still have not replied approximately one week later receive a final reminder postcard, sent the Wednesday following the survey pack.

4. Data Collection Process

4.1. The Online Survey
The online survey was programmed in-house using specialist market research software. The software offers considerable flexibility in survey design, question structures and options, while allowing the inclusion of design features which ensure a professional standard of presentation. The programming process also involved testing of the survey link before going live, to ensure all questions and skips were functioning as they should. The survey is available in forms suited to a range of devices, including smartphones and tablets.

The survey is hosted on the Gravitas website, with respondents directed to the webpage in the invitation letter. All randomly selected respondents are assigned a unique ID number, which is included in the invitation letter, the reminder postcards and the survey packs. Respondents are asked to enter this ID number as part of the online survey, as it is used to track who has completed the survey.

4.2. The Self-Completion Postal Survey
The mail-out survey pack includes a return envelope printed with Gravitas's freepost details, so respondents can simply fold the questionnaire and put it in the post (no stamp is needed). Questionnaire forms are also numbered with the individual's unique ID, so any returned surveys can be linked back via the ID to the respondent/address the questionnaire was sent to (in order to keep track of those who have completed the survey and those who have yet to take part).

Returned postal surveys are visually checked for completion as they arrive in the mail, and a daily check is made on the response achieved and on the level of return-to-sender, incomplete or spoiled questionnaires, to identify any issues. Returned completed questionnaires are then entered into a database programmed in Askia (the same software used for the online survey). The data entry programme is also set up with sophisticated logic checks to ensure the integrity of the data. These were developed and specified with the State Services Commission before implementation, along with an agreed coding system for any missing data (based on previous years' systems for consistency). The data entry protocols used for this survey since 2009 were followed for consistency.
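The return tracking described above might be sketched as follows, assuming a simple in-memory dictionary keyed by the unique ID; the actual system's data structures are not documented here.

```python
def record_return(sample_status, unique_id, mode):
    """Mark a sampled individual as complete via 'online' or 'hardcopy'.

    sample_status maps each unique ID printed on the invitation letter,
    postcards and questionnaire to a status record. A second return from
    the same ID is ignored, so each respondent is counted once.
    """
    record = sample_status[unique_id]
    if record["status"] != "complete":
        record["status"] = "complete"
        record["mode"] = mode
    return record

def pending_ids(sample_status):
    """IDs still awaiting a response - the pool for the next reminder."""
    return [uid for uid, rec in sample_status.items()
            if rec["status"] == "pending"]
```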

5. Data Processing - Coding and Cleaning

Once data entry of the self-completion postal surveys is complete, the data is combined with the data collected from the online survey to form one main database for processing and analysis.

5.1. Coding
Any open-ended responses, as well as those entered into "Other" categories, are back-coded. This involves fitting responses into existing categories and, where necessary, creating new categories so that all results have a numeric code. Coding is undertaken by an experienced member of the Gravitas coding team, and all coding is checked by the Gravitas Project Manager prior to incorporation into the main database. The back-coding code-frame used in previous surveys is retained, to ensure that the level of detail of the codes is appropriate and consistent with the coding undertaken as part of the previous survey.

5.2. Data Cleaning
Only minimal cleaning of the data set is required once surveying and data entry are complete. The need for extensive cleaning is minimised through:

1. Comprehensive pilot testing of the data entry programming and online survey, to ensure that there are no errors in the programming of the questionnaire (for example, incorrect skips/jumps between questions, multiple-response questions only allowing a single response to be entered, or insufficient instructions to respondents or the data entry team);
2. Use of specialist software for programming the data entry and online questionnaire. One of the key strengths of the programming is that respondents and data entry staff are unable to skip questions: the programme will not allow a respondent or data entry staff member to move to a new question without entering a response to the current question (even if this is just "don't know" or "refused"). This ensures that there are no missing values in the final data set. Similarly, the programming only allows respondents to select from the response choices given on the screen;
3. The software storing all terminated surveys (that is, those which were started but not completed by the respondent) in a file separate from the completed interviews, so there are no incomplete records in the final data set; and

4. Auditing by the data entry supervisory team. This process identifies any errors in the data entry of responses and allows them to be corrected, and any necessary further checks to be conducted, prior to the compilation of the database.

The actual data cleaning process is conducted at the end of every month. It involves manual checking of the dataset by the Gravitas Data Manager to ensure each record is complete, and checking of the data for each question to ensure the responses given are valid (that is, contained within the options provided); a minimal sketch of such checks is given below. Before databases are sent to the State Services Commission, the data cleaning process also involves removing all respondent identifiers (names, addresses, etc.) from the database.
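A sketch of these per-record validity checks, with hypothetical question identifiers and placeholder identifier fields:

```python
def check_record(record, allowed_codes):
    """List the problems found in one survey record.

    allowed_codes maps each question ID to its set of valid response codes,
    including the agreed codes for "don't know" / "refused" responses.
    """
    problems = []
    for question, valid in allowed_codes.items():
        if question not in record:
            problems.append(f"{question}: missing value")
        elif record[question] not in valid:
            problems.append(f"{question}: invalid code {record[question]!r}")
    return problems

def strip_identifiers(record, fields=("name", "address", "phone", "email")):
    """Remove respondent identifiers before databases are sent to SSC."""
    return {key: value for key, value in record.items() if key not in fields}
```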

6. Response Rate

6.1. Response Rate Calculation
To calculate response rates, every individual sent an invitation to complete the survey is tracked and the outcome of the invitation recorded. A call-log tracks which of the letters, postcards or questionnaire packs are returned as "gone, no address", as well as any telephone or email notifications of refusal to participate. This log also records notifications from third parties that the nominated respondent is not available or able to complete the survey due to age, language issues, health reasons, death or disability.

If a respondent is having difficulty completing the survey, they can call the Gravitas 0508 number to ask for assistance. Alternatively, the respondent can ask for someone to complete the survey on their behalf; however, the respondent must be present, and it must be their experiences and opinions that are recorded, not those of the person helping them complete the questionnaire.

The response rate is calculated as follows:

  Response rate = (completed surveys / (total invitations mailed out - ineligibles)) x 100

Ineligibles comprise "gone, no address" returns and those unable to participate due to age, language issues, health or disability. The following table is an example of the response rate calculation, using data from the July to September 2016 quarter of interviewing; a small worked sketch follows the table.

Table 2: Response Rate Calculation Example - July to September 2016

  Total surveys mailed out (a)                                   2,592
  Ineligibles (b)                                                  218
    Gone - no address                                              169
    Unable to participate (age, language, health/disability)        49
  Completes (c)                                                  1,077
    Online                                                         662
    Hardcopy                                                       415
  Incomplete eligibles (d)                                       1,302
    Refused (0800 number)                                           10
    Did not hear back                                            1,285
    Survey not fully completed                                       7
  Response rate c/(a-b)                                            45%
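Applying the Section 6.1 formula to the Table 2 figures gives the worked example below (a minimal sketch; the function names are ours, and the mode split anticipates Section 6.4):

```python
def response_rate(mailed, gone_no_address, unable_to_participate, completes):
    """Completed surveys over eligible invitations, as a percentage."""
    ineligibles = gone_no_address + unable_to_participate
    return 100 * completes / (mailed - ineligibles)

def mode_split(online, hardcopy):
    """Proportions of completes achieved online and on paper."""
    total = online + hardcopy
    return 100 * online / total, 100 * hardcopy / total

# July to September 2016 figures from Table 2:
print(round(response_rate(2592, 169, 49, 1077)))  # -> 45 (per cent)
print(mode_split(662, 415))                       # -> roughly (61.5, 38.5)
```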

6.2. A Note on Response Rates by Demographics
It should be noted that the response rate differs by demographic group, and is notably lower for the booster groups - Māori and youth - who make up larger than usual proportions of the sample. For example, in the six-month period from July to December 2016, n=588 survey invitations were sent to Māori respondents, with a response rate of 17%. Youth (those aged 18-24) also have a lower response rate, at 30%.

6.3. Maximising Respondent Comfort to Sustain and Build Response Rates
The State Services Commission has a strong focus on increasing the response rate over the life of the survey. The continuous approach to the survey means that changes can be made to the methodology each month, and the impact of these changes on the response rate monitored. In order to maximise respondent comfort, and so sustain and build response rates, the following steps were put in place:

1. A survey helpdesk was offered, via a freephone number and email contact, to field any queries regarding completion of the survey.
2. The questionnaire was tailored to respondents' specific answers and circumstances - through appropriate skips and routing programmed into the online version of the survey, and clear instructions in the paper version - maximising the relevance of the survey and minimising the time required to respond.
3. An online survey process allows respondents to complete the questionnaire at a time that suits them, with strong emphasis on the flexibility of responding: the questionnaire does not have to be completed all at once but can be re-accessed to suit the participant, who is also able to backtrack to modify or check answers.
4. The option of completing a hard-copy postal version of the survey is available if respondents are not able to complete the survey online.

We will also be investigating ways to increase the number of completes and the response rate on an ongoing basis, particularly among the lower-responding groups of interest (i.e. Māori).

6.4. Mode of Response
For all completed surveys, the method of completion (online or hardcopy) is captured in the survey tool. This allows the proportions of completed online and hardcopy questionnaires to be calculated for each period, as follows:

  Online proportion = (number of online completes / total completed surveys) x 100
  Hardcopy proportion = (number of hardcopy completes / total completed surveys) x 100

On average (based on results since 2012), 56% of respondents choose to complete the survey online, with the remaining 44% returning hard-copy questionnaires. However, it should be noted that in the July to September 2016 quarter the proportion of online completes (61%) was slightly higher than usual.

6.5. Cut-Off Dates for Inclusion of Data
There is limited control over when a respondent will complete the survey. For example, a respondent sent an invitation in February (Q1) may, for a number of reasons, not complete the questionnaire until May (Q2). Protocols for cut-off dates for inclusion in a particular quarter's data, and for the reporting of the response rate, were agreed previously and have been maintained for consistency. The approach consistently applied is that a completed questionnaire is counted in the month in which it was completed online or received back in the post (not the month it was sent out). In the example above, the survey would be included in the Q2 data (for processing).

For simplicity, the same protocol is applied to the reporting of response rates: any survey received after the cut-off date for a quarter is counted in the following quarter's response rate calculation. For the final dates for inclusion, please refer to Table 1.

7. Annual Sample Composition and Weighting

7.1. Annual Sample Composition and Weighting
To account for factors such as sample design and non-response bias, the data is weighted each quarter before reporting. The purpose of weighting is to adjust the sample to represent the overall New Zealand population. The variables used for weighting are:

- Age
- Gender
- Location (Territorial Authority)
- Ethnicity

Weighting is based on the proportions in the adult (18+) population of New Zealand in the Statistics New Zealand 2013 Census results for age, gender, location and ethnicity. As an example, the unweighted and weighted distributions of both the 2015 and 2016 databases are shown below. Looking at the unweighted samples, youth made up a larger proportion of the completed surveys in 2016, while older respondents (particularly those aged 65 and over) made up a greater proportion of the completes in 2015. However, once weighted, the proportions of each demographic group are similar in both annual samples.

Table 3: Unweighted v Weighted Distribution of the 2015 and 2016 Databases
(- indicates a value not recoverable from the source document)

                 Unweighted                    Weighted
                 2015     2016    Difference   2015    2016    Difference
  18-24 years    9.3%     15.8%   -            -       13.1%   -
  25-34 years    10.8%    10.3%   -            -       15.8%   -
  35-44 years    13.4%    13.3%   -            -       18.0%   -
  45-54 years    18.6%    18.1%   -            -       18.9%   -
  55-64 years    19.3%    17.5%   -            -       15.3%   -
  65+ years      28.5%    25.0%   -            -       19.0%   0.0
  Females        55.8%    56.4%   -            -       52.1%   -0.1
  Males          44.2%    43.6%   -            -       47.9%   0.1
  Asian          10.4%    10.5%   -            -       11.0%   -0.2
  Māori          7.8%     8.3%    -            -       11.1%   -0.1
  Other          10.9%    8.6%    -            -       10.2%   -0.6
  Pacific        2.2%     2.9%    -            -       5.2%    -0.2
  NZ European    74.4%    76.0%   -            -       70.0%   -
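The exact weighting algorithm used for Kiwis Count is not reproduced here. As one common way of weighting a sample to several population margins at once, the sketch below implements simple raking (iterative proportional fitting) to Census proportions; the variable names and the choice of raking are assumptions for illustration.

```python
def rake_weights(respondents, margins, n_iter=20):
    """Rake weights so weighted sample shares match population margins.

    respondents: list of dicts keyed by the weighting variables, e.g.
    {"age_band": "18-24", "gender": "F", "location": "X", "ethnicity": "Y"}.
    margins: {variable: {category: population proportion}}, with proportions
    summing to 1 within each variable (e.g. from the 2013 Census).
    Assumes every category in `margins` appears in the sample.
    Returns one weight per respondent, averaging 1.
    """
    n = len(respondents)
    weights = [1.0] * n
    for _ in range(n_iter):
        for var, target in margins.items():
            # Current weighted total in each category of this variable.
            totals = {cat: 0.0 for cat in target}
            for w, r in zip(weights, respondents):
                totals[r[var]] += w
            # Scale each respondent so weighted shares hit the targets.
            weights = [w * target[r[var]] * n / totals[r[var]]
                       for w, r in zip(weights, respondents)]
    return weights
```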

8. Data Analysis Explanation

8.1. Scale Conversion for Comparisons
The Kiwis Count surveys ask New Zealanders to rate services or express opinions using a scale from 1 to 5. This is consistent with Canada's Citizens First 4 survey, on which the Kiwis Count surveys are based. In the reports from Citizens First 4 and 5, responses from the five-point scales are converted to a scale ranging from 0 to 100, and the average of these converted scores is referred to as the service quality score. To enable comparisons between Kiwis Count and Citizens First, we have adopted the Canadian approach of converting the five-point rating scales to service quality scores. The two scales correspond as follows:

  1 (Very poor) = 0
  2 = 25
  3 = 50
  4 = 75
  5 (Very good) = 100

Where the survey measures satisfaction, a five-point scale is used on which one equals "very dissatisfied" and five equals "very satisfied". In the report, the percentage satisfied equals ratings of four or five, the percentage neutral equals a rating of three, and the percentage dissatisfied equals ratings of one or two:

  1 (Very dissatisfied), 2 = Dissatisfied
  3 = Neutral
  4, 5 (Very satisfied) = Satisfied

The same approach is taken where people are asked their level of agreement with statements on various attributes of public services, where one equals "strongly disagree" and five equals "strongly agree". Total agree equals ratings of four or five, total neutral equals a rating of three, and total disagree equals ratings of one or two:

  1 (Strongly disagree), 2 = Disagree
  3 = Neutral
  4, 5 (Strongly agree) = Agree
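Expressed as code, the conversion and the grouping of ratings look like the following (a small sketch; the input is simply a list of 1 to 5 ratings):

```python
def service_quality_score(rating):
    """Map a 1-5 rating onto the 0-100 scale: 1->0, 2->25, ..., 5->100."""
    return (rating - 1) * 25

def satisfaction_groups(ratings):
    """Percentage satisfied (4-5), neutral (3) and dissatisfied (1-2);
    the same grouping applies to the agree/disagree scales."""
    n = len(ratings)
    satisfied = 100 * sum(r >= 4 for r in ratings) / n
    neutral = 100 * sum(r == 3 for r in ratings) / n
    dissatisfied = 100 * sum(r <= 2 for r in ratings) / n
    return satisfied, neutral, dissatisfied
```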

8.2. Overall Service Quality Scores Calculation
The Overall Service Quality Score is calculated by rescaling each respondent's five-point ratings (1, 2, 3, 4, 5) to a 101-point scale (0, 25, 50, 75, 100), as shown above, then calculating the average of these scores across all the services used. The overall average uses all service experiences, so a respondent who has used ten services contributes ten observations to the overall score, and a respondent who has used one service contributes one observation. The same approach is used for service groups (e.g. Health, Local Government) and individual services. Please refer to Appendix 2 for the service groupings.

8.3. A Note on the 2016 Change to Calendar Year Reporting
Continuous surveying for Kiwis Count began in January 2012. The first data point (June 2012) was used as the 2012 result, and subsequent reporting used data from July-June years as annual results. Now that there are five full years of data since the continuous methodology began, we have moved to a December year-end reporting period.

8.4. Suppression Rules
Kiwis Count results for questions with unweighted sample counts of less than 50 (i.e. those answered by fewer than 50 people) should be used with caution: the margin of error around a result, due to sampling, increases as the sample size decreases. Results for questions with unweighted sample counts of less than 25 are suppressed. A small sketch combining the score calculation with these rules follows.
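Combining the conversion with the averaging and the Section 8.4 suppression rules gives a sketch like the one below. It is unweighted for simplicity (published Kiwis Count results are weighted as described in Section 7), and it treats the observation count as the sample count, whereas the actual suppression rules use respondent counts.

```python
def overall_service_quality_score(experience_ratings,
                                  caution_n=50, suppress_n=25):
    """Average 0-100 score over all service experiences.

    experience_ratings holds one 1-5 rating per service experience, so a
    respondent who used ten services contributes ten observations and a
    respondent who used one service contributes one.
    """
    n = len(experience_ratings)
    if n < suppress_n:
        return None, "suppressed"                 # fewer than 25: suppress
    score = sum((r - 1) * 25 for r in experience_ratings) / n
    flag = "use with caution" if n < caution_n else "ok"  # fewer than 50
    return score, flag
```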

Appendix 1: July to December 2016 Questionnaire

[The 15-page questionnaire is reproduced as page images in the original report.]


Appendix 2: Service Groupings

The 42 individual government services are grouped into the following service groups for analysis:

Passports and citizenship
- A passport
- Registering a birth, death, marriage or civil union

Education and training
- A state or state-integrated (public) school that your child attends or may attend in the future
- A university, polytechnic or wānanga, about a course you are attending or may attend in the future
- Employment or retraining opportunities
- Applying for or receiving a student loan or student allowance
- A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo, playgroup etc. that your child attends or may attend in the future
- ERO (Education Review Office) school or early childhood reports

Health
- Received outpatient services from a public hospital (includes A & E)
- Stayed in a public hospital
- Obtaining family services or counselling
- Used an 0800 number for health information

Local government
- Visited a public library
- Your local council, about rubbish or recycling (excluding the actual collection of rubbish and recycling from your household each week)
- Your local council, about property rates
- Your local council, about road maintenance
- Your local council, about a building permit

Environment and recreation
- Visited a National Park
- A hunting or fishing licence
- National environmental issues or the Resource Management Act

Social assistance and housing
- The Community Services Card
- Accident compensation for injuries
- Sickness, domestic purposes or unemployment benefit
- A housing subsidy or accommodation supplement
- A childcare subsidy
- Living in a Housing New Zealand home
- A rental property bond lodgement, refund or transfer
- New Zealand Superannuation

Border services
- The arrival process after landing at a New Zealand international airport from Australia
- The arrival process after landing at a New Zealand international airport from anywhere except Australia
- Importing goods into New Zealand, or customs duties

Justice and security
- The Police (for a non-emergency situation)
- Paying fines or getting information about fines
- Emergency services (i.e. 111)
- A court, about a case you were involved with

Motor vehicles
- Obtained, renewed, changed or replaced a driver licence
- Licensed or registered a vehicle

Taxation and business
- Enquired about tax, receiving tax credits (such as Working for Families), student loan repayments or KiwiSaver
- Contact with Statistics New Zealand for information or about taking part in a survey
- Importing goods into New Zealand, or customs duties
- Registering a new company or filing an annual return for a registered company
- Visited sorted.org.nz for information to help manage your personal finances or retirement income
- Registered a business entity for tax purposes or filed a tax return