Design Monitoring and Evaluation Coordinator

Deadline: 2024-07-26

Offer Expired

Employee Contract Type:
Local - Fixed Term Employee (Fixed Term)

Job Description:

MAJOR RESPONSIBILITIES

Planning in the Programme Cycle

Facilitate the development of Area Programme Plans in a way that builds partner capacity to ensure sustainability. Make recommendations for log frame development based closely on relevant information.
Facilitate a participatory review of logical flow and consistency within the hierarchy of objectives and assumptions. Contextualize indicators drawing on community conversations as appropriate. Identify key areas for monitoring according to the log frame. Identify and contextualize methodologies for appropriately measuring indicators, using participatory methods where possible.
Work proactively with key staff and stakeholders to develop a detailed monitoring plan for the technical programme, ensuring clarity of roles and responsibilities.
Share monitoring and evaluation plans with key stakeholders and incorporate their feedback as appropriate. Seek approval for the monitoring and evaluation plan from the program design team lead as necessary.
Support planning for effective Surveys (Evaluations and Baselines) for APs transitioning or entering new phases within the Cluster
Support Area programs to plan and coordinate M&E activities
Lead the process of annual planning and budgeting by all the APs and Grants depending on their implementation calendar and phases

Technical Support

Provide support to AP teams and partners in data processing, consolidation and analysis across the local programming areas.
Provide oversight on scheduling and budgeting for monitoring processes according to the needs of the technical projects (TPs).
Support Projects to ensure real-time monitoring and appropriate data storage for timely utilization by staff and partners to show progress and issues at Area Program, Cluster and national level.
Provide constructive/technical support to area programme managers on monitoring data generation to ensure that information produced is relevant and useful to the different stakeholders.
Share lessons learned from technical programmes with area programmes to enhance local planning and implementation. Suggest modifications that might be necessary at the local level based on evidence and lessons learned
Support the establishment and ensure use of quality monitoring systems and processes within the Cluster

Visit field offices to monitor GIS operationalization in programs/projects and collect data

Quality Assurance and Management through the Programming Cycle

Ensure the sustained implementation of all DME-related processes, including LEAP/HORIZON, STEP, GIS/WVDPA/HAP/SPHERE, at the Area Program and Cluster operational levels;
Ensure that DM&E-related activities - assessment, design, monitoring, baselines, evaluations, transition, and documentation - are successfully implemented as per standards, disseminated to staff, and used to inform future M&E processes and decision making.
In relation to the AP plans: Verify that collaboration with partners is appropriate to their capacity. Ascertain that the specific needs of the most vulnerable children are addressed. Verify the inclusion of collaboratively developed sustainability and transition plans with associated agreements with partners
Ensure a harmonized, timely and comprehensive routine and non-routine monitoring system for quality data as per the Technical Programme needs at the county and Cluster levels

Management Reporting of Progress and Performance

Facilitate/socialize Area Program teams’ understanding of requirements for a management report, according to LEAP/WV procedures. Guide the teams in the efficient use of the data management system to access relevant and necessary data for report generation.
Review the analyzed and interpreted data from across the area programmes in the Cluster to collate learning and use the monitoring data developed during the reporting period as the basis for the programme report.
Facilitate review sessions of participatory monitoring data to identify key messaging for the report
Coordinate collection of feedback from technical report reviewers to Area Program managers. Check reports for accuracy and quality to ensure they meet LEAP reporting requirements. Provide technical support to program managers on uploading reports to Horizon.
Support documentation of special donor reports and documentation of best practices or success stories
Lead the process of semi-annual and annual reporting by all the APs and Grants depending on their implementation calendar and phases
Compile and validate monthly and quarterly management reports

Capacity Building

Support implementing staff to have required capacity (skills, knowledge & attitude) in DM&E processes and utilize it for enhanced programming.
Promote an enhanced culture of learning, innovation and practice of LEAP/DME discipline and Empowered World View
Lead the implementation of new program effectiveness initiatives for staff within the Cluster, e.g. Horizon Wave 2, WVDPA, EWV
Support building the capacity of implementing staff to integrate the programme and accountability framework within Area Programs
Support Local Institutions capacity building initiatives
Support Programs/Projects sustainability and transition (S&T) planning and implementation

Surveys (Evaluations/Baselines) and Documentation

Participate actively in scoping discussions with key stakeholders, in an analytical and well-informed way, to inform the program evaluation ToR. Build capacity of field-level teams in understanding the relevant information for the basic information sheet.
Support the Area Program teams in logistics planning for the training of data collectors and for data collection in the field in accordance with the ToR. Critically analyze and approve plans as necessary to respond to field realities, in collaboration with the evaluation lead and/or Cluster Program Manager.
Coordinate the review and provision of feedback on standard tools in consultation with National Office DME Manager
Participate in the field testing of tools to ensure the evaluation lead has adopted a technically sound methodology that ensures accuracy in data collection. Critically review and approve the final tools to be used for the evaluation after feedback has been received.
Co-facilitate training for data collectors and data entry clerks, in a way that supports effective learning. In collaboration with the Area Program Cluster, support the smooth running of the data collection process.
Support APs in storing collected data securely, as per data protection policies, for all programs at the Cluster level. Deal with and/or escalate technical issues and problems proactively as they arise, adapting plans as appropriate. Participate actively in reflection sessions and provide constructive critical feedback as necessary.
Participate in data analysis, interpretation and validation with relevant stakeholders according to the indicator analysis plan
Participate in feedback processes, with community, staff and partners in a context sensitive way. Co-facilitate reflection and learning with staff and partners to refine interpretation of the findings. Review and provide considered feedback on the baseline or evaluation report, as requested
Provide input for products appropriate for different audiences to share the findings and recommendations, as requested. Participate in reflection with DME team to review and refine learnings about the evaluation process
Contribute to the identification, protocol setting and monitoring of action research integrated into Technical Programmes or their models

QUALIFICATIONS FOR THE ROLE

Bachelor’s degree in a relevant field from a recognized university, preferably in Social Sciences (Project Planning and Management, Statistics, Development Studies, Monitoring and Evaluation, Impact Assessments and Accountability) or related studies. An advanced (Master’s) degree will be an added advantage.
Certification in appropriate DME systems and software: SPSS, STATA, Epi Info, ENA for SMART, R, Microsoft Excel, among other qualitative and quantitative analysis techniques
Certifications in monitoring, evaluation, data analysis, project management and/or documentation/knowledge management; Project Management for Development Professionals (PMD Pro)
Effective written and verbal communication (English and French).

EXPERIENCE REQUIRED

Minimum of 5 years’ active experience in project cycle management
Extensive conceptual understanding of, and demonstrated practical command of, implementing program design/logical framework approach, management and evaluation principles;
Experience in integrated Programme/Projects design and associated tools development
Demonstrated ability to train and build capacity of other staff and grassroots institutions for effective Programme delivery
Must have knowledge of and practical experience in research, organizational learning and documentation, and have good writing and editing skills.
Ability to design and coordinate surveys (feasibility studies/assessments, baselines, evaluations) using both online and offline platforms
Good level of proficiency in data analysis techniques/software and data management
Ability to design M&E tools, surveys, surveillance systems, and evaluations
Strong interpersonal skills and managerial capacity [Preference]
Ability to build capacity of staff in relevant technical fields
Proficiency in written and spoken English.
Good interpersonal, organizational and management skills
Ability to solve complex problems and to exercise independent judgment
In-depth knowledge and understanding of WV working systems, policies and standards will be an added advantage.

Applicant Types Accepted:
Local Applicants Only
