TOOLKIT FOR EVALUATION IN EMPLOYABILITY
by Alison Barr of WBS & Partners


CONTENTS

1. Why is Evaluation Helpful?
2. What is being Evaluated?
3. Who Should Conduct the Evaluation?
4. The Evaluation Plan
5. Choosing the Correct Evaluation Methods
6. Analysing the Evaluation Data
7. Reporting the Evaluation Results
Appendix A: Example of Consent Form
Appendix B: Recommended Sections to Include in an Evaluation Report

1. Why is Evaluation Helpful?

When your partnership begins a new project or way of working, it is essential that the successes and challenges are accurately monitored and evaluated. This in turn leads to positive changes and maximum impact. This impact could be on anything from the partnership's culture to the service users, but ultimately always on the overall local and national employability figures.

2. What is being Evaluated?

Whether your partnership is measuring its own way of working or a local project, you will have goals and a work plan designed to reach them. It is the success or otherwise of this work plan that is being evaluated. The work plan can be divided into four measurable sections: inputs, processes, indicators and outcomes.

Inputs are the resources needed to manage the project or approach to working. The process is the methodology used to reach the goals. Indicators are the measurable consequences of the process, which include the baseline (circumstances before the project began) and the output (circumstances after the project ended). Finally, the outcomes are the impact on the person, people or partnership benefiting from the end product, which, if successful, is equivalent to the goals. Outcomes may include elements which were not part of the original goals.

Some examples of goals and their measurable elements are shown below.

Example 1
GOAL: Progress service users' addiction issues as a specific area of development within the partnership.
BASELINE INDICATORS: Addiction issues are deemed to be important but are given limited time and resources within the core partnership.
INPUTS: Staff and resources.
PROCESSES: A new addiction issues partnership sub-group is formed.
OUTPUT INDICATORS: The strategy and goals of the new partnership addiction issues sub-group. Two work plans designed in which three partners share resources.
OUTCOMES: The new partnership addiction issues sub-group has allowed resources to be shared and focused. This has progressed a specific area of development with new models of engagement and relieved pressure on the core partnership.

Example 2
GOAL: Frontline staff to offer an improved integrated service to service users.
BASELINE INDICATORS: Staff have limited knowledge and understanding of the services they can integrate for service users.
INPUTS: Staff and a training service.
PROCESSES: Training is organised, delivered and evaluated, focused on innovation and new thinking around the integration of services.
OUTPUT INDICATORS: The number of people who received training, and their assessment of learning. Action plans on integration created and operational.
OUTCOMES: The people who received the training can now offer an improved integrated service to users. This has increased the number of people accessing appropriate services, utilised resources and shortened the time from engagement to placement.

Example 3
GOAL: Increase the number of employers employing new staff through a training scheme.
BASELINE INDICATORS: 10 employers (15% of those offered) have employed staff through a training scheme.
INPUTS: Employers and support services.
PROCESSES: Employers are offered new onsite support services.
OUTPUT INDICATORS: The number of employers who received support, and their evaluation of the difference it made to their perception.
OUTCOMES: Employers employing new staff through a training scheme has increased to 25%, with positive feedback.
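For partnerships that track indicators in a spreadsheet or script, the work plan structure above can be represented as a simple record. The following is a minimal illustrative sketch only (the toolkit does not prescribe any software, and the field and class names are hypothetical), using the figures from Example 3:

```python
# Hypothetical sketch: one work plan represented as a record, so the
# baseline and output indicators can be compared directly.
from dataclasses import dataclass

@dataclass
class WorkPlan:
    goal: str
    baseline_pct: float  # baseline indicator (circumstances before the project)
    output_pct: float    # output indicator (circumstances after the project)

    def change(self) -> float:
        """Percentage-point change from baseline to output."""
        return self.output_pct - self.baseline_pct

plan = WorkPlan(
    goal="Increase the number of employers employing new staff through a training scheme",
    baseline_pct=15.0,  # 10 employers (15% of those offered)
    output_pct=25.0,    # after onsite support services were offered
)
print(f"{plan.goal}: {plan.change():+.0f} percentage points")
# -> Increase the number of employers ...: +10 percentage points
```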

3. Who Should Conduct the Evaluation?

Those close to a project may find it difficult to conduct the evaluation without bias. In ideal circumstances the person leading the evaluation will be separate from the project and have nothing to gain from either positive or negative results. It may be useful here to offer an exchange evaluation service with another partnership, e.g. exchanging personnel for a short period or a few days to evaluate each other's projects.

4. The Evaluation Plan

Write a clear and concise evaluation plan with explanations of the decisions made and timetables of work. Write it so that somebody outwith the organisation, with no prior knowledge of your work, will be able to understand the process.

5. Choosing the Correct Evaluation Methods

To design the appropriate evaluation methods for your work plan, identify:

1. The goals of the work plan / the desired outcomes.
2. The available inputs.
3. The indicators (measurable elements).

When these fundamentals have been identified, the process can be designed, i.e. the methods used to measure the indicators. There is a vast array of methods available at the process stage, most of which are unnecessarily complex for employability work plans. Remember to measure both the baseline and output indicators.

It is common to use pluralistic methods to give a full picture, i.e. evaluating in both qualitative and quantitative styles. For example, a questionnaire is used to collect quantitative data from a large group of people; people who respond with certain answers are then invited for interview to gather more in-depth qualitative data; and, in one or more particularly interesting situations, case studies are formulated to illustrate the results of the work plan.

[Flow-charts for selecting the appropriate evaluation method appeared here; they are not reproduced in this transcription.]

Following is a summary of the purpose and best uses of the techniques detailed in the flow-charts above.

QUESTIONNAIRES
Purpose: To obtain personal views and/or demographic information with minimal resources. Can be anonymous. Usually fully quantitative, or mostly quantitative with minimal qualitative questions. These can include yes/no questions and rating questions. N.B. A question's word bias may result in incorrect data.
When to use: When honest answers are vital. When a large sample is needed. When closed questions can be used. When statistical comparisons are required. When data analysis must be scientifically robust.
When not to use: When in-depth qualitative data is required. When a personal approach is needed.

INTERVIEWS
Purpose: To obtain in-depth personal opinions. N.B. The interviewer must not have a bias.
When to use: As a follow-up to a questionnaire. When in-depth information is needed from a small number of people. When a personal approach is needed. When honest answers to qualitative questions are important.
When not to use: When broad opinion of a public or specific group is required. When data analysis must be scientifically robust. When fully honest answers are vital. When there is a medium to large data set.

FOCUS GROUPS
Purpose: To obtain in-depth personal opinions through group discussion, where similarities and discrepancies in opinions will become apparent. N.B. The facilitator must be experienced in group work.
When to use: As a follow-up to a questionnaire or interview. When a comparison of in-depth information is needed from a small or medium data set. When a personal approach is needed.
When not to use: When fully honest answers to qualitative questions are vital. When there is a large data set. When data analysis must be scientifically robust.

CASE STUDIES
Purpose: To obtain an in-depth account of one or more projects, available for comparison.
When to use: As a follow-up to a questionnaire. When the entire process from goals to outcomes needs to be recorded succinctly within a limited number of words. When comparison studies are needed. When depth of information is more important than breadth. When a personal approach is needed.
When not to use: When breadth of information is more important than depth. When fully honest answers are vital. When data analysis must be scientifically robust. When there is a large data set.

DOCUMENTATION REVIEW
Purpose: To obtain a picture of progress based on reviewing existing documents, e.g. applications, minutes, finances, reports, etc.
When to use: When it is vital that the evaluation technique does not interrupt the flow of the work. When the documents available give a comprehensive picture of the progress. When facts are more important than opinions or demographic data.
When not to use: When opinions or demographic data are more important than facts. When incomplete information does not allow an accurate picture.

Please note that if you are involving service users' personal information in your evaluation, they must first give you permission in the form of a signed consent form. If a service user chooses not to give this permission, this must not affect any service they are entitled to. See Appendix A for an example of a consent form.

6. Analysing the Evaluation Data

Analysis of your data is one of the most critical points of your evaluation. It is also one of the most underestimated activities. Use the original goals as the basis for the evaluation, and use the data collected to answer the questions the goals asked, e.g.:

- Has there been development within the partnership in relation to the specific area of service users' addiction issues?
- Are the frontline staff able to offer an improved service to service users?
- Is there an increase in the number of employers taking on new staff under a training scheme?

Analysing Qualitative Data

To analyse qualitative data:
- Identify themes and sub-themes.
- Identify patterns across respondents.
- Quantify results, e.g. 60% of respondents felt they received an improved service after the training schedule was completed.

For more information on qualitative results reporting, see the Useful links section at the end of this toolkit.

Analysing Quantitative Data

Quantitative data can be used to show the results in the following ways:
- Total figures
- Percentages
- Mean (the arithmetic total of all elements of the data set divided by the number of elements)
- Mode (the data element which occurs most frequently)
- Midrange (the arithmetic mean of the highest and lowest data elements)
- Median (the middle element when the data set is arranged in order of magnitude)

For more information on quantitative results reporting, see the Useful links section at the end of this toolkit. A short illustrative calculation follows at the end of this section.

Interpreting the Data

Using the project goals, compare the actual results with the desired results, including comparing the baseline indicator with the output indicator. Comment on how successful or otherwise the work plan was. Comment on unexpected results and further work needed. Make recommendations, based on the results, for changes to current work and the potential for future work.
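As an illustration of the summary statistics defined above, they can be computed with Python's standard library. This is a sketch only: the toolkit does not prescribe any software, and the sample data set below is invented.

```python
# Illustrative sketch: computing the summary statistics defined above.
# The sample data set is invented (e.g. 1-5 ratings from a questionnaire).
import statistics

scores = [3, 5, 4, 5, 2, 5, 4]

total = sum(scores)                         # total figure
mean = statistics.mean(scores)              # arithmetic total / number of elements
mode = statistics.mode(scores)              # most frequently occurring element
midrange = (max(scores) + min(scores)) / 2  # mean of highest and lowest elements
median = statistics.median(scores)          # middle element in order of magnitude

# Quantifying a qualitative theme, e.g. "60% of respondents felt they
# received an improved service": code the theme per respondent, then count.
improved = [True, True, False, True, False]
pct_improved = 100 * sum(improved) / len(improved)

print(total, mean, mode, midrange, median, f"{pct_improved:.0f}%")
# -> 28 4.0 5 3.5 4 60%
```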

7. Reporting the Evaluation Results

The level of detail needed in the evaluation report depends on the audience. A report for use within the partnership only can be shown succinctly as a SWOT diagram, as below. However, a report used for a funding application may require a specific level of detail. See Appendix B for recommended sections to include in a more extended report.

STRENGTHS (INTERNAL POSITIVE): Topline successful areas of the project which were controlled by the partnership.
WEAKNESSES (INTERNAL NEGATIVE): Topline unsuccessful areas of the project which were controlled by the partnership.
OPPORTUNITIES (EXTERNAL POSITIVE): Topline successful areas of the project which were not controlled by the partnership.
THREATS (EXTERNAL NEGATIVE): Topline unsuccessful areas of the project which were not controlled by the partnership.

Appendix A: Example of Consent Form

[An explanation of the project, including who is leading it.]

I agree to take part in the evaluation of _______________.

I understand that I can withdraw at any time, before, during or after I have given any information, and that this will not affect the services I am offered.

(If appropriate) I give my permission for the interview to be recorded and transcribed, and to be used for this project alone. I understand that any information that could personally identify me will be removed before publication.

Name:
Signature:
Date:

Appendix B: Recommended Sections to Include in an Evaluation Report

1. Title Page
   a. Title of project
   b. Name of organisation
   c. Date
2. Executive Summary
   a. A one-page concise overview of what was evaluated, what the results were and what the recommendations are.
3. Table of Contents
4. Introduction
   a. Information about the organisation
   b. Why the evaluation took place
   c. Consent forms (if appropriate)
5. Aims and Objectives
   a. What was hoped to be gained from the evaluation?
6. Methodology
   a. What evaluative method(s) were used and why?
   b. What were the restrictions or limitations of this methodology?
7. Results
   a. Qualitative and/or quantitative results
8. Conclusion
   a. What does the data mean in terms of the original goals?
   b. What was expected and unexpected?

   c. Recommendations for:
      i. Current work
      ii. Future work
      iii. Evaluation procedures
9. Appendices
   a. Questionnaires
   b. Interview questions
   c. Focus group facilitation outline
   d. Details of data analysis techniques
10. References
11. Bibliography

Good luck with your evaluations. Once you have created a process that works for you, share it with your colleagues and other partnerships.

Useful links
- WBS & Partners website
- Info on quantitative results reporting
- Info on qualitative results reporting