Design, Monitoring & Evaluation Coordinator


Location: [Africa] [Kenya]
Town/City: Karen
Category: Programme Effectiveness

Purpose of the position:

Based at the Nairobi office, this position exists to support design and planning processes, program and strategy performance monitoring, and the generation of evidence for learning and reporting in line with the World Vision Somalia Office Strategy and donor requirements. Specifically, the Coordinator provides DME technical support to the Program Development Unit and Technical Unit teams in the assessment and design of projects; ensures evidence-based design, planning and results-based reporting; participates in cluster meetings where DME expertise is required; engages with the TAs to ensure that sector-specific data are translated into technical reports for engagement with both internal and external stakeholders; assures quality in reporting processes and products; supports staff and partner capacity strengthening in DME based on prioritized needs; and uses the information generated to promote a culture of learning and reflection within WVS and among partners.

Major Responsibilities:

Assessment, Design & Planning and Resource Acquisition

- Ensure program designs and plans are informed by assessment findings and recommendations.
- Support the resource acquisition team by reviewing concepts and proposals for the overall quality and consistency of designs and plans (logic, coherence and clarity), providing feedback and assistance to improve them, with attention to such concerns as a sound, clear logical framework with SMART indicators to provide evidence during and after implementation.
- Coordinate with the field DME Managers to provide relevant assessment data for use in proposal development exercises as appropriate.
- Engage with TAs in the design of assessments and provide technical guidance in conducting assessments, baselines and evaluations in line with established LEAP or donor-related guidelines at the national level.
- Engage with TAs to promote, support and strengthen the adoption and use of the Lot Quality Assurance Sampling (LQAS) methodology, using Open Data Kit (ODK) or any emerging recommended software, for rapid assessments and annual monitoring of project outcomes.

Monitoring, Reporting and Evaluation

- Coordinate with Programme Officers the quarterly and annual report review processes in line with the national DME calendar.
- Analyze program data and periodic reports from various projects into periodic management reports that demonstrate evidence of progress on the National Office Strategy, for Technical Project Managers, senior management, Senior Leadership and other WV partnership offices.
- In collaboration with the field DME Managers, support projects to undertake activity tracking and results-based monitoring of outputs and outcomes using standardized reporting templates/formats, protocols, guides, dashboards and databases, ensuring contributions to the National Office Strategy are well captured.

- Coordinate the process of quarterly and annual outcome monitoring for all projects, ensuring that all participate in the national outcome monitoring process using the LQAS methodology and the adoption of ODK and GIS platforms.
- Ensure program (IHA) accountability and other cross-cutting themes are mainstreamed in project reporting.
- In collaboration with the Communications team and the field DME Managers, support project teams to generate impact / Most Significant Change evidence in the form of text (magazines, fact sheets) or digital videos.
- Synthesize and summarize trends in project monitoring data, including those generated from evaluations across sectors, to inform national-level programming; write monthly, quarterly, semi-annual and annual reports, including the national office child well-being report and department scorecards, to track their contribution to the NO strategy and CWBOs.
- Develop and manage an updated database of strategy outcome and standard output indicators for tracking strategy progress and WV Somalia's contribution to CWBOs.
- Track planned evaluations and ensure that evaluation reports are obtained and shared in a timely manner to facilitate learning internally and across the partners.

Learning and external engagement

- Be the QA focal point for Nairobi-based cluster and technical working meetings where QA/DME expertise is required.
- Consolidate lessons-learned documentaries and share them with partners and communities to promote the use of knowledge generated from project-level community reflections, project monitoring and evaluations, with a view to improving future program selection, design and implementation, as well as improvements to existing frameworks.
- Take part in monitoring and evaluation networks (e.g. ReDSS) with other partners, as well as regional and national teams, to stay abreast of best monitoring and evaluation practice and to support quality programming and accountability standards.
- Provide regular feedback to partners and staff to improve the quality of documents at their source.
- Enhance corporate learning, capacity building and codification of knowledge through regular contributions to the project monitoring, evaluation and reporting communities of practice.
- Provide documentation of case studies and success stories.

Evidence creation and utilization

- In collaboration with the TA and TU teams, generate sector-specific information and reports for the production of the national office CWBO report.
- Support and promote evidence-based learning at the National Office and project level through operations research and documentation.
- Support TAs and projects in documenting and disseminating innovations and new research findings related to WV work in Somalia.

- Generate data summaries and visualizations to demonstrate to the NO and regions/zones the contribution of the projects to the NO strategy.
- Promote the utilization of data/information at sector and national level through cluster and senior management meetings or conferences.
- Conduct meta-evaluation of all completed project annual reports, assessments, baselines, evaluations and special reports to establish key learning and recommendations to inform future evaluations.
- Contribute to research processes and needs assessments based on the national office priorities.

Quality assurance

- Support projects to undertake programme effectiveness self-reviews.
- Perform periodic data quality assessments to ensure the validity, integrity, precision, reliability and timeliness of all project performance data; identify any deficiencies and suggest corrective actions; and assist technical team members to maintain electronic and hard-copy files.
- Monitor and follow up on the utilization of recommendations from quarterly, annual and evaluation surveys across the national office projects.
- In collaboration with the DMEO (KM), track timely implementation and reporting of program plans.
- Review concept notes, proposals, management reports and other M&E reports to ensure data validation and compliance with recommended guidelines, formats and standards.
- Compile a graded national summary of reports based on the report quality review tool, with specific recommendations on key findings and learning for improvement.

DME Capacity Strengthening

- In collaboration with the QA & Strategy Manager, develop and implement a mentorship program for DME and project teams aimed at enhancing their capacity and skills to assume increased evidence-based reporting, M&E and programming responsibilities.
- In collaboration with the regional DME Managers, strengthen the capacity of staff to utilize existing M&E tools for proper tracking and reporting in line with World Vision Somalia's M&E Framework, including coaching and mentoring for staff, partners and other stakeholders on design, monitoring, evaluation and research.
- In collaboration with the regional DME Managers, provide technical assistance and support to project teams in monitoring projects and preparing results-focused, evidence-based project reports (monthly reports, quarterly progress reports, annual project reports, inception reports).

Qualifications: Education/Knowledge/Technical Skills and Experience

Educational level required: A minimum of a university degree in Public Health, Statistics, Quantitative Economics, Development Studies, Social Sciences, Community Development or any related field. Postgraduate training in monitoring & evaluation is an added advantage.

Technical training qualifications required: M&E certification from any recognized institution or gained from work-based training. Computer literacy is required, with working knowledge of MS Word, Excel, PowerPoint and at least one statistical package such as SPSS or STATA. Excellent analytical and report-writing skills. Ability to train others and effectively facilitate meetings is required.

Professional technical skills desired: Working knowledge of M&E industry standards; analysis using qualitative and/or quantitative methods, including the use of participatory methods and tools for planning, monitoring and evaluation; SPHERE standards; the Durable Solutions Framework; the Code of Conduct for the Red Cross/Red Crescent; Humanitarian Accountability Partnership (HAP) and other international humanitarian standards; and other capacity-building skills.

Experience: A minimum of four years' work experience in monitoring and evaluation of development and humanitarian programmes, or work with INGOs or humanitarian agencies.

Working Environment / Conditions:

Work environment: 70% office-based, with frequent travel to the field.
Travel: 30% domestic/international travel to the field in Somalia will be required.