Quality Assessments of Statistical Production Processes in Eurostat
Pierre Ecochard and Małgorzata Szczęsna, Eurostat


Since 1994, Eurostat has developed its own approach to measuring the quality of statistics. A working group with Member States was set up in 1998 with the responsibility to develop European Statistical System (ESS) standards and guidelines for assessing the quality of statistics, such as the Definition of Quality [2002, 2003a], the Standard Quality Report [2003b] and the Standard Quality Indicators [2005]. With the adoption of the European Statistics Code of Practice (1) in 2005 and the Eurostat Quality Assurance Framework [Baigorri and Linden, 2007], the emphasis turned to developing quality assessment tools on top of the existing quality measurement devices.

In 2008, an office-wide quality assessment exercise was launched. It will last three years and cover almost all the statistical processes in Eurostat. This follows the recommendations of the Peer Review team which visited Eurostat in October 2007 and underlined the necessity for Eurostat to develop quality assessment and "to implement fully the recently approved quality assurance framework, including an office-wide assessment of data quality based on quality reviews" [2007].

This assessment exercise is integrated in the Eurostat Quality Assurance Framework (QAF), which addresses statistical processes, products and user needs, covering a subset of the Code of Practice. The QAF does not cover management systems and leadership or support processes, which are tackled by other quality management tools, or the institutional environment, which is addressed by the ESS Peer Reviews.

Figure 1: Coverage of the Quality Assurance Framework (the QAF covers user needs, statistical products and production processes; management systems and leadership, support processes and the institutional environment lie outside its scope).

(1) The European Statistics Code of Practice is available in 27 languages on the Eurostat quality website: http://europa.eu.int/comm/eurostat/quality

The QAF comprises a set of methods and tools that, when systematically applied to the statistical processes and outputs, ensure sustainable high quality standards. Quality assessments are the central element of the QAF. Building on quality documentation, assessments are the way toward quality improvements and a step in the labelling of statistical processes. A preliminary version of the methodology for quality assessments was included in the paper presented at the CCSA meeting of September 2007 [Baigorri and Linden, 2007]. This paper presents the final version of the methodology as adopted by the Eurostat Directors' Meeting in October 2007, introduces an office-wide plan for its implementation and shares some experience from the first assessments conducted.

Please note that throughout this document, the phrase "statistical process" should be interpreted as meaning the production of one or several similar statistical series, from the collection of data to the dissemination of results. For instance, the labour force survey is considered to be one statistical process, and the collection, processing, analysis and dissemination of the data are considered as parts of this single statistical process.

Before describing the situation in Eurostat, the next section details the results of an ESS-wide survey on quality assessment activities in national statistical institutes. Section 2 describes the methodology of quality assessments in use in Eurostat. Section 3 discusses the practical aspects of their implementation and the lessons learned.

1. QUALITY ASSESSMENT ACTIVITIES IN THE EUROPEAN STATISTICAL SYSTEM

In order to place the Eurostat quality assessment programme in a wider context, this section details the results of a survey conducted by Eurostat. In the framework of Eurostat's monitoring of European Statistical System compliance with the European Statistics Code of Practice, Eurostat sent NSIs a questionnaire in December 2007 concerning their quality assessment activities. The goal was to gather information on the practical activities in NSIs regarding quality assurance, at a greater level of detail than was previously available. The questionnaire was composed of 29 questions, mostly multiple-choice, in order to reduce the burden placed on respondents and to collect the same information from all NSIs, to the extent possible. The questionnaire was to be filled in by the person in charge of quality in each NSI. All 31 NSIs replied; however, a few replies were incomplete. The main findings are the following:

- Quality reports and quality indicators are the most widely used quality assessment tools in the ESS; a significant part of this use results from the requirements of statistical legislation. In the future, many NSIs intend to extend coverage beyond legal obligations.
- Audits and self-assessments are used in a majority of NSIs; in most NSIs, they cover only a small share of the statistical processes. Seven NSIs produce neither audits nor self-assessments.
- The coverage of all quality assessment tools, and especially audits and self-assessments, is foreseen to increase significantly throughout the ESS during the next three years. However, only two NSIs plan to produce quality reports, quality indicators and self-assessments for all processes, and two NSIs do not plan to conduct either audits or self-assessments in the next three years.

The results of quality assessment activities are made public by only a small share of NSIs: half of them publish quality reports and quality indicators, and fewer than one in eight publish audit or self-assessment reports.

2. METHODOLOGY FOR QUALITY ASSESSMENTS IN EUROSTAT

2.1. Background

2.1.1. The Eurostat Quality Assurance Framework

On its path towards the full implementation of the European Statistics Code of Practice, Eurostat adopted a Quality Assurance Framework (QAF) putting the requirements of the Code into concrete form by providing guidelines for improvements at output and process level. The main focus is on the level of individual statistical domains rather than on the quality of the statistical system as a whole. Figure 2 presents the structure of the Eurostat QAF, organising the quality assurance methods and tools into three interlinked layers: documentation and measurement, evaluation, and conformity.

Figure 2: A structure for tools and methods for assessment of statistics production (layer 1: documentation and measurement, covering quality reports and indicators, process descriptions and user satisfaction surveys; layer 2: evaluation, covering quality assessments; layer 3: conformity, covering labelling against user requirements and standards).

In the first layer, the complex information obtained from measurement and documentation has to be selected and structured in order to become meaningful for quality assessment. For this purpose, methods and tools such as key process variables (resources used, time used, error rates, response burden), quality indicators (revision size, coefficient of variation, response rates), quality reports and user satisfaction surveys are used.
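
To make the first layer more concrete, the short sketch below computes three of the quality indicators mentioned above: a unit response rate, a coefficient of variation and a mean absolute revision. It is a minimal illustration only; the function names, formulas and figures are simplified assumptions and do not reproduce the definitions in the Eurostat Standard Quality Indicators [2005].

```python
import statistics

def unit_response_rate(units_responding: int, units_eligible: int) -> float:
    """Share of eligible units that returned usable data."""
    return units_responding / units_eligible

def coefficient_of_variation(estimate: float, standard_error: float) -> float:
    """Relative sampling error of an estimate."""
    return standard_error / estimate

def mean_absolute_revision(first_estimates: list[float],
                           final_estimates: list[float]) -> float:
    """Average absolute difference between first and final releases."""
    return statistics.mean(abs(final - first)
                           for first, final in zip(first_estimates, final_estimates))

# Hypothetical figures for one statistical process and one reference period.
print(unit_response_rate(units_responding=8_450, units_eligible=10_000))  # -> 0.845
print(coefficient_of_variation(estimate=102.3, standard_error=1.9))       # -> ~0.019
print(mean_absolute_revision([2.1, 1.8, 2.4], [2.3, 1.7, 2.6]))           # -> ~0.17
```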

Based on the information compiled in the first layer, the conformity of statistics is evaluated against (internal or external) standards. There are four different types of assessments, all based on a common Checklist (see details below). The outputs produced (an assessment report including improvement actions, best practices and an assessment diagram) show a broader picture with fewer details than process variables, quality indicators, quality reports and user satisfaction surveys. They provide an overall picture of the adherence of a given set of statistics to the standards, while still giving information on the various quality dimensions or main processes.

The third layer covers the methods of certification or labelling, which further condense the information and assure users and the general public of compliance with a whole set of defined standards and requirements (ISO or Code of Practice based). Labelling has been retained in the Eurostat QAF because it also aims at compliance with the institutional principles of the Code and can help to enhance trust and credibility in official statistics. Work on the labelling methodology is still in progress.

The application of quality assessment methods requires as a precondition that information on quality is available for the statistical process to be evaluated. The situation may range from complete information (including the description of the statistical process, full quality reporting of the outputs and users' views) to synthetic information on quality, usually in the form of some key quality indicators. Quality assessments take the existing quality information as input, evaluate the statistical process and its outputs against pre-fixed standards, identify strengths and weaknesses and derive the corresponding improvement actions. Addressing the shortcomings identified in the improvement actions will enhance the statistical process, its outputs and users' perception, reaching a new step in the documentation layer so that the statistical process is ready for a subsequent evaluation. This procedure continues until the pre-fixed standards are fully met, leading to the conformity layer, which includes labelling.
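
The iterative procedure just described can be read as a simple loop over the three layers. The sketch below is a toy model under invented assumptions (a per-chapter score on a 1-5 scale, a fixed threshold and fixed planned gains); it only illustrates the documentation, evaluation and improvement cycle, not any actual Eurostat tool.

```python
def quality_cycle(chapter_scores: dict[str, float],
                  planned_gains: dict[str, float],
                  threshold: float = 4.0,
                  max_rounds: int = 5) -> str:
    """Illustrative documentation -> evaluation -> improvement loop.

    chapter_scores: current appraisal per checklist chapter (hypothetical 1-5 scale).
    planned_gains:  expected score gain per chapter in each improvement round.
    """
    for _ in range(max_rounds):
        weaknesses = [c for c, score in chapter_scores.items() if score < threshold]
        if not weaknesses:                 # pre-fixed standards fully met
            return "conformity layer: ready for labelling"
        for chapter in weaknesses:         # derive and implement improvement actions
            chapter_scores[chapter] = min(5.0, chapter_scores[chapter]
                                          + planned_gains.get(chapter, 0.5))
    return "standards not yet met: plan a further assessment round"

print(quality_cycle({"Data collection": 3.2, "Documentation": 2.8, "Dissemination": 4.5},
                    {"Documentation": 1.0}))
```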

2.1.2. The role of quality assessments

Quality assessments form the evaluation layer within the QAF. They are an indispensable step toward the highest possible quality of statistics, which is defined by the perception of the statistical product by the user, the characteristics of the statistical product(s) and the underlying statistical production process. These three aspects of quality are closely interrelated. Product quality is achieved through the production process, and different process designs will give priority to different product quality components; a process will never maximise all product quality components at the same time (e.g. the trade-off between accuracy and timeliness). The way the product (and the process) is perceived by the user will often deviate from the way it is perceived by the producer. For example, the user might not always have a full overview of the entire set of quality components. He or she might also give priority to other quality components (e.g. preferring timeliness over accuracy), or have difficulties assessing certain quality components without expert support (such as accuracy).

For this reason it is vital that data quality assessment also covers the question of how users actually perceive the quality of a statistical product. Data quality assessment has to address all three quality aspects; focusing only on product quality (or on process quality, or on user perception) is not sufficient. Quality assessments make these choices explicit, thus fostering an informed discussion about the quality of statistics. At the same time, quality assessments allow the various steps of the statistical value chain to be reviewed systematically against pre-defined benchmarks, thus providing a basis for further optimisation of data quality.

2.2. Quality assessments in practice

2.2.1. A statistical process and its outputs: the subject of the assessment

The Eurostat quality assessment methodology focuses on the statistical process and its outputs. It considers the whole chain of statistical production, from the identification of user needs to the dissemination of results. Figure 3 presents the statistical process orientation of Eurostat assessments, which is common within the ESS. Another possible approach would be to assess a particular step in the production process: for instance, an assessment could focus on data collection across the whole organisation. This approach is followed by some NSIs, either as a complement or as an alternative to the statistical process focus.

Figure 3: Process orientation of the assessment Checklist; chapter numbers are given in parentheses: Conceptual framework (2), User needs (3), Data collection (4), Validation at country level (5) and at international level (6), Confidentiality (7), Documentation (8), Dissemination (9), Follow-up (10), IT conditions (11), Management, planning and legislation (12), Staff, work conditions and competence (13).

The statistical process orientation of assessments in Eurostat is driven by the fact that a statistical process provides a relatively homogeneous entity that is stable over a given period of time, which allows the various entities to be treated equally in the assessments and the assessments to be compared. Furthermore, the identified statistical processes can serve other activities ongoing in the office, such as the review of priorities, the definition of costs, or burden and benefit analyses. The processes are also very close, if not identical, to the basic entity "statistical survey" proposed by NSIs as the basis for a cost assessment during the last Working Group on "ESS programming and coordination".

2.2.2. The Eurostat Statistical Processes Assessment Checklist

The Eurostat methodology for quality assessment relies on the 'Eurostat Statistical Processes Assessment Checklist' (ESPAC). The ESPAC examines chronologically all the steps in a given production process, from the definition of user needs to the dissemination of results. It builds on the DESAP for NSIs [Eurostat, 2003c] but underwent extensive modifications to fit the particular needs of Eurostat.

The ESPAC has been designed to meet different needs. First, it is an assessment tool which provides an overall picture of the quality of both the statistical output and the underlying statistical production process; it should be used to identify the areas where improvement is most needed. Second, it provides guidance in the consideration of potential improvement measures that could be implemented in the statistical production process. Third, it provides a means for comparing the level of quality over time and across similar domains. However, as the results are subjective, it should be kept in mind that careless comparisons based on the checklist can be misleading; more reliable comparisons can be achieved through comprehensive quality reports. Fourth, it is a helpful tool for identifying good practices in the statistical production chain throughout the organisation and promoting them for wider application.

The completion of the ESPAC yields three tangible outputs (an illustrative sketch of the Assessment Diagram logic follows the list):

- A Summary Assessment Report presenting the principal strengths and weaknesses of the investigated domain, with the resulting recommendations for improvement and the identification of good practices. Identified strengths can be used for benchmarking purposes (such as setting targets or sharing best practices) within and between statistical organisations. Identified weaknesses can form the basis for a quality action plan that can be used for launching and monitoring quality improvement actions.
- An Assessment Diagram graphically illustrating the results of the quality measurement. It is useful for summarising the strengths and weaknesses of the assessed statistics. If the checklist is reviewed on a regular basis (e.g. every year), the quality level of the same set of statistics can easily be monitored.
- The description of a good practice identified during the assessment. This will foster the adoption of such good practices by other statistical production processes.
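
As an illustration of how an Assessment Diagram condenses the checklist results, the sketch below averages hypothetical question scores into one value per ESPAC chapter (the chapters shown in Figure 3). The scoring scale and the data are invented for illustration; the actual ESPAC templates and diagram layout may differ.

```python
from collections import defaultdict

# ESPAC chapters as listed in Figure 3 (chapter number -> title).
ESPAC_CHAPTERS = {
    2: "Conceptual framework", 3: "User needs", 4: "Data collection",
    5: "Validation - country level", 6: "Validation - international level",
    7: "Confidentiality", 8: "Documentation", 9: "Dissemination",
    10: "Follow-up", 11: "IT conditions",
    12: "Management, planning and legislation",
    13: "Staff, work conditions and competence",
}

def assessment_diagram(answers: list[tuple[int, float]]) -> dict[str, float]:
    """Condense (chapter, score) appraisals into one average value per chapter.

    Each tuple holds a hypothetical score for one checklist question,
    e.g. on a 1 (weak) to 5 (strong) scale.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for chapter, score in answers:
        totals[chapter] += score
        counts[chapter] += 1
    return {ESPAC_CHAPTERS[c]: round(totals[c] / counts[c], 1) for c in totals}

# Invented appraisals for a few questions in three chapters.
print(assessment_diagram([(4, 4.0), (4, 3.0), (8, 2.0), (8, 3.0), (9, 5.0)]))
# -> {'Data collection': 3.5, 'Documentation': 2.5, 'Dissemination': 5.0}
```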

2.2.3. Four categories of assessments

Statistical processes are very diverse, and this heterogeneity should be taken into account when setting up the quality assessments. In order to ensure the efficiency and the acceptance of the exercise, the assessments should be tailored to different profiles of statistical processes. Taking this into account, Eurostat chose to set up four categories of quality assessments (self-assessments, supported self-assessments, peer reviews and rolling reviews) and established some basic criteria for assigning a statistical process to an assessment category (see below). In all categories of assessment, the ESPAC Checklist is used as the main evaluation tool; the main difference between the categories is the degree of external intervention in the assessment.

Self-assessment

In a self-assessment, the Checklist is filled in by the person (or team) responsible for the statistical process. The role of the quality team is to assist the domain manager during this process and to ensure, to the extent possible, the coherence of assessments across Eurostat.

Supported self-assessment

In a supported self-assessment, the Checklist is filled in under the responsibility of the domain manager with extended support from the quality team. Thus, the burden placed on the production unit is reduced and a high degree of coherence of assessments across statistical processes (and over time) is ensured.

Peer review

In a peer review, the procedure is similar to that of a supported self-assessment, except that an expert who does not belong to the production unit is invited to take part in the assessment. The reviewer brings in technical expertise in the domain being assessed as well as increased objectivity, making for greater credibility of the assessment.

Rolling review

In a rolling review, a more comprehensive assessment of the statistical process is implemented by reviewing the statistical data, the process used to produce them, the interactions with data providers and user satisfaction. An external contractor implements the rolling reviews, supported by the evaluation function of Eurostat.

3. IMPLEMENTATION OF QUALITY ASSESSMENT ACTIVITIES IN EUROSTAT

3.1. Principles for implementation

A successful implementation of quality assessments is conditional on the level of involvement in, and ownership of, the exercise by the production teams. In this context, the assessments are set up taking into account the need to minimise the burden for production units and applying a high degree of flexibility regarding the timetable and the category of assessment. To avoid duplication of work and excessive burden, the assessments build heavily on already existing quality documentation and measurements, such as quality reports, process analyses and user satisfaction surveys. Furthermore, extensive support from Eurostat's quality unit (at levels varying with the assessment category) is provided throughout the entire assessment process.

In order to ensure compliance with the above principles, the assessment Checklist and the assessment methodology were tested during the pilot assessments conducted in two Eurostat domains in 2007. The pilots were the basis for improving the Checklist, setting up workflows for the particular categories of assessment and estimating the resource impact. The experience gathered constituted a substantial input to the development of the methodology for an office-wide quality assessment plan at Eurostat.

In a wider context, the quality assessments and the QAF in general are conceived to integrate in an efficient way the existing demands on management, reporting and evaluation from the Commission and other stakeholders, by providing input that avoids repetitive work, contributes to minimising the burden for production units and allows synergies with other horizontal activities in Eurostat to be exploited.

3.2. Inventory of statistical processes in Eurostat

In order to establish an implementation plan for the assessments, the quality team contacted production units to establish a list of statistical production processes in Eurostat together with their main characteristics. As a result, an inventory of 128 processes and their characteristics was produced. For the processes to be assessed, Eurostat applies the following criteria for assigning an assessment category:

- For processes with low periodicity, no legal basis and low-visibility outputs, self-assessments (the baseline form of quality assessment) seem suitable. For these processes the availability of limited quality reports prior to the self-assessment might be sufficient, given the investment needed to produce full quality reports.
- For processes involving substantial financial resources and a high number of staff, with short-term or yearly periodicity, and which are in the front line of users' demands for statistical outputs, rolling reviews (the most intensive form of quality assessment) should be reserved. Such rolling reviews are quite resource intensive and are therefore limited to four statistical domains in the Eurostat planning for 2008.
- For other processes, quality assessments (with or without external intervention) should be chosen, allowing some flexibility in order to take into account the specifics of the process and the opinions and demands of the process owners.

Quality documentation regarding processes and outputs has to be in place before quality assessments are conducted. For Eurostat this means in practice that quality reports, either in the form of national quality reports or EU quality reports (synthetic information drawn from the national reports), are available. This implies that processes without quality reports would in principle need to be excluded from quality assessments until quality reports become available. An exception could be self-assessments, where basic data quality information may be sufficient.

3.3. Mapping of statistical processes with types of quality assessments

Eurostat decided to consider the following characteristics for mapping groups of processes to a type of quality assessment (a rule-of-thumb sketch follows the list):

- Relevance and visibility of the output
- A proxy for the resources consumed by the process, based on the number of full-time-equivalent (FTE) staff
- Degree of involvement of Eurostat in the data validation and production chain
- Degree of NSI and NCB involvement (which can be understood as a proxy for the degree to which the data are based on official sources)
- Justification for the data collection (legal basis, gentlemen's agreement, etc.)
- Periodicity (to identify irregular statistical processes or processes with a less than annual periodicity)
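
For illustration, the sketch below turns the criteria of section 3.2 and the characteristics listed above into a rule-of-thumb assignment function. The field names, thresholds and decision order are assumptions made purely for illustration; in practice the mapping (Figure 4 below) was agreed with the process owners rather than computed mechanically.

```python
from dataclasses import dataclass

@dataclass
class StatisticalProcess:
    """Characteristics used to match a process with an assessment category."""
    name: str
    periodicity: str        # e.g. "monthly", "annual", "irregular"
    legal_basis: bool       # is the data collection backed by legislation?
    high_visibility: bool   # are the outputs in the front line of user demand?
    fte_staff: float        # proxy for the resources consumed by the process
    quality_report: bool    # quality documentation available beforehand?

def suggest_category(p: StatisticalProcess) -> str:
    """Hypothetical rule of thumb reflecting the criteria of section 3.2."""
    if not p.quality_report:
        return "postpone (or self-assessment once basic quality information exists)"
    if p.high_visibility and p.fte_staff >= 10 and p.periodicity in ("monthly", "quarterly", "annual"):
        return "rolling review"
    if p.periodicity == "irregular" and not p.legal_basis and not p.high_visibility:
        return "self-assessment"
    return "supported self-assessment or peer review (to be agreed with the unit)"

print(suggest_category(StatisticalProcess("Example survey", "annual", True, True, 12, True)))
# -> rolling review
```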

Further characteristics serve the decision about the timing of the quality assessment:

- Availability of a quality report, considered a prerequisite for the quality assessment
- Whether a quality assessment has already been carried out, and when

Figure 4: Matching processes with assessment categories (processes are assigned to self-assessment, supported self-assessment, peer review or rolling review according to characteristics such as periodicity, legal basis, output and Eurostat intervention).

3.4. Quality assessment plan for 2008-2010

After piloting the assessment approach in two statistical domains in 2007, Eurostat established in early 2008 an office-wide assessment plan for reviewing most of the statistical processes during the three-year period 2008-2010. The 2008 round of quality assessments started with one-hour seminars organised with each of the Eurostat Directorates concerned. During these seminars, the staff responsible for the statistical processes to be assessed in 2008 received more in-depth information about the exercise. In particular, they were informed about how the assessments fit into the overall quality activities of Eurostat, about the tools used and about the practical details of the assessments, including the assessment categories. From these seminars the following 2008 work programme emerged: 16 self-assessments, 14 supported self-assessments, 1 peer review and 4 rolling reviews. The allocation of the assessment category was based on the criteria mentioned above. The assessments were set up taking into account the need to minimise the burden for production units and applying a high degree of flexibility regarding the timetable and the category of assessment.

In order to further raise staff awareness of the exercise, a series of internal communication and training activities related to the assessments was organised, such as the creation of an intranet page, lunchtime presentations, announcements on the intranet, etc.

These channels will also be used to spread the results of the assessments and to disseminate good practices.

3.5. Feedback from the first assessments

As the assessment exercise started in January 2008, it is now possible to draw its first lessons. As of the writing of this paper, 14 assessments had been launched, of which 2 were closed. The feedback can be summarised as follows:

- The general workflow proved appropriate: both the assessment team and the production units found it efficient and appropriate.
- Domain managers generally expressed their satisfaction with the assessments and found them useful.
- The Checklist itself proved flexible enough to be used in very different areas, from national accounts to survey or administrative data.
- An early involvement of Heads of Unit was found to have a positive impact on their degree of ownership of the assessment. As they have a key role in the implementation of improvement actions, it is very important that they participate in the assessment from the beginning.
- The assessment diagram is useful for summarising the results of the assessment. However, it can be a red herring: in some cases it can divert the attention of the production team from improving the quality of the process and the outputs through improvement actions to trying to get "better marks", which is detrimental to the very goals of the assessments. To avoid this, it helps to introduce the diagram late in the process and to downplay its importance. It also proved helpful to remind production units that the diagram is not to be used for comparisons across domains.
- The ownership of the outputs of the assessment (both the answers to the Checklist and the Assessment Report) should be made very clear right from the beginning. Most importantly, in the case of a self-assessment, the production unit owns the outputs. Clarifying this helps to focus attention on improvements rather than on arguing over a particular appraisal.
- In the allocation of processes to assessment categories, complete freedom was given to each unit: although they received guidance from the quality unit, they had the final say. This led to an insufficient number of peer reviews compared to supported self-assessments. In the future, it will be necessary to encourage more units to choose peer review as the assessment category for their processes.

3.6. The work ahead

By the end of 2008, a review of the conducted assessments and the underlying methodology will take place and be reported to the Directors' Meeting. The report will cover several aspects of the 2008 round, including a possible need to adjust the instruments used (ESPAC Checklist, templates for outputs, etc.), the workflows and the resource burden. The report will also address the recurrent weaknesses identified in Eurostat statistical processes, in order to feed improvement actions into the Eurostat Annual Management Plan (AMP) and to emphasise initiatives that need to be implemented by horizontal units in order to facilitate quality improvements by production units.

In parallel to this report, the work plan for the 2009 round will be elaborated, taking into account the experience from the first assessments. A series of meetings with the production teams will take place in order to agree on the timing and the category of the assessments.

References

Baigorri, A. and Linden, H. (2007), "A Quality Assurance Framework for Eurostat". Paper presented at the CCSA meeting, 10-11 September 2007, Madrid.

Edwards, R., Holt, T. and Kopsch, G. (2007), "Peer Review on the implementation of the European Statistics Code of Practice", published on the Eurostat internet site: http://europa.eu.int/comm/eurostat

Eurostat (2002), "Quality in the European Statistical System - the way forward", Eurostat, Luxembourg, published on the Eurostat internet site: http://europa.eu.int/comm/eurostat/quality

Eurostat (2003a), "Definition of Quality in Statistics", adopted by the Eurostat Working Group "Assessment of quality in statistics" at its meeting in October 2003.

Eurostat (2003b), "Standard Quality Report", adopted by the Eurostat Working Group "Assessment of quality in statistics" at its meeting in October 2003, published on the Eurostat internet site: http://europa.eu.int/comm/eurostat/quality

Eurostat (2003c), "Development of a Self Assessment Program (DESAP)", Eurostat-granted project led by FSO Germany with project members Statistics Austria, Statistics Finland, ISTAT Italy, Statistics Sweden and the Office for National Statistics UK. Published on the Eurostat internet site: http://europa.eu.int/comm/eurostat/quality

Eurostat (2005), "Standard Quality Indicators", adopted by the Eurostat Working Group "Quality in Statistics" at its meeting of 23-25 May 2005.