Consultancy to Support External Evaluation of Serve Project at Concern Worldwide

Posted by Kenya Vacancies

  • We are an international humanitarian organisation dedicated to tackling poverty and suffering in the world’s poorest countries. Concern Worldwide began working in Kenya in May 2002 with the development of an urban programme in Nairobi. Our work expanded into a multi-sectoral programme focusing on urban and rural livelihoods, primary education, HIV and ...


    Consultancy to Support External Evaluation of Serve Project
    • Job Type: Full Time
    • Qualification: BA/BSc/HND, MBA/MSc/MA
    • Experience: 5 years
    • Location: Nairobi
    • Job Field: Consultancy, Research

    Objectives and Specific Tasks to be Undertaken by the Consultant(s)

    • The evaluation will use a mixed methods approach, integrating both quantitative and qualitative data. The evaluation will be anchored on the OECD-DAC criteria as presented above and led by an external consultant/firm.

    The evaluation methodology will encompass, but is not limited to, the following:

    • Review all quantitative data on program indicators, including a comparison of baseline and end-line data, and analyze this data against the evaluation questions outlined above. Additionally, the evaluators will examine performance monitoring data collected throughout the BHA awards to assess the systematic implementation of results-based monitoring within the awards. This analysis will be based on data collected by Concern throughout the awards; hence, no further quantitative data will be collected.
    • The external evaluators will lead the qualitative data collection to address the evaluation questions using Focus Group Discussions (FGDs) with program participants and Key Informant Interviews (KIIs) with staff and stakeholders. The qualitative approach will also explore issues identified in the quantitative review, focusing on the effectiveness of the FCRM system, the sustainability of food assistance, agriculture (particularly livestock), WASH, nutrition, and barriers faced by men, women, and marginalized groups. This process will consider societal structures, demographics, and gender and age factors to inform future program design.
    • The sampling approach for primary data collection will be non-probabilistic and purposive, in line with its qualitative focus. This approach aims to capture a diverse range of perspectives on specific aspects of the implementation of the Awards. The evaluation team is encouraged to propose a sampling strategy with distinct samples for focus group discussions in each sectoral intervention (Food Assistance, Agriculture, Nutrition, and WASH). The strategy must ensure the inclusion of female participants and marginalized groups. The team should determine a sample size sufficient to achieve saturation on the evaluation questions, based on their analysis of the project and scope of work. The team is likewise encouraged to propose a purposive sampling approach for staff and stakeholder interviews, aiming for saturation on the evaluation questions. Concern will fully support and facilitate the sampling process to ensure the effective collection of evaluation data.

    Desk-based research/preparation

    The evaluation team will review program documents, including proposals, progress reports, monitoring records, and distribution data, alongside Concern’s quantitative data aligned with BHA indicators. They will also analyze reports from other humanitarian organizations and sector-specific documents, such as SMART Survey Reports, Rain Assessment Reports, National Drought Management Bulletins, Disease Surveillance Reports, and Kenya Health Demographic System reports for Food Assistance, WASH, Agriculture, and Nutrition sectors.

    Field-based research

    • In the field, Concern’s Program Managers and the MEAL Team will support the evaluation by coordinating interviews and discussions with program participants, non-participants, and staff through household interviews and focus group discussions (FGDs). Key informant interviews will be conducted with stakeholders, including program participants, local authorities, humanitarian actors, and the line departments at the county level, i.e., the Department for Environment, Water and Natural Resources, the Department for Agriculture, Fisheries and Livestock Development, and the Department of Health. The external evaluators will be responsible for proposing a sampling methodology to Concern.
    • After the fieldwork, time will be allocated for the evaluator to analyze and review the collected data, draft the report, and refine subsequent versions based on feedback from partners, Concern, and other peer reviewers. As part of Concern’s commitment to downward accountability, the organization will ensure that the evaluation findings are effectively disseminated to program participants and other stakeholders. This process will take place at multiple levels:

    Community Level Dissemination:

    The findings will first be shared directly with program participants through community consultations conducted in all program locations. The program teams will facilitate these sessions to ensure that the information is presented in a clear, accessible, and culturally appropriate manner. This process will take place over one month. This approach ensures that participants are informed about the outcomes and have the opportunity to provide feedback.

    County-Level Forum

    Concern will organize a dissemination forum at the county level to share the evaluation findings with key stakeholders. This forum will involve various county departments, including relevant technical teams, as well as Concern’s County program staff and local partners. The objective is to foster a collaborative discussion about the results, lessons learned, and implications for future programming within the county.

    National-Level Forum

    Finally, Concern will convene a national-level forum to present the evaluation findings to a broader audience, including national stakeholders, program staff, relevant authorities, and the donor. This forum will provide an opportunity for strategic dialogue, policy alignment, and resource mobilization to ensure the sustainability of program outcomes and the integration of lessons learned into national-level planning.

    Outputs

    • The evaluator will be fully responsible for the following:
    • A concise inception report outlining the data collection methodology (qualitative and quantitative), data collection tools (checklists and questionnaires), work plan, and submission and review timelines. The report will also detail the proposed sampling approach, including sample sizes and the planned number of focus group discussions and key informant interviews for each respondent category.
    • A first draft of the Evaluation Report with an executive summary and clear recommendations (complete, excluding appendices) for comment from Concern Kenya within 1 week of concluding interviews.
    • A presentation of findings to the Concern Kenya country team upon completion of the draft Evaluation Report.
    • A full final draft of the Evaluation report, integrating the feedback received within one week of receiving consolidated feedback on the draft report.
    • The report, in English, should be 10-15 pages long without appendices and should be submitted in electronic format (Word or PDF) to the country program and include the following sections:
    • Executive Summary (maximum 2 pages)
    • Brief context and description of the intervention
    • Presentation of evaluation methodology and any limitations encountered
    • Presentation of main findings, conclusions, and recommendations using graphs, charts, and tables where appropriate
    • All information should be disaggregated as per the PRIS for the BHA award, including, where permissible and logical, disaggregation per location.
    • The analysis should combine quantitative data outcomes with identified issues to inform future strategies for addressing barriers in humanitarian interventions and enhancing program impact.
    • Recommendations emphasizing key lessons learned to enhance Concern’s future emergency multi-sector program planning, implementation, and management responses
    • Scoring against the extended DAC criteria
    • Annexes: including the Terms of Reference (ToR), a list of people and groups consulted, interview frameworks/questionnaires, data collection tools, a list of sites visited, abbreviations, and any maps, charts, or graphs used in the evaluation.

    Essential and Desirable Experience/Qualifications

    • Master’s degree (preferred) or bachelor’s degree (minimum) in Development Studies, Social Sciences, Statistics, Applied Research Methods, Livelihoods, or Social Work.
    • Minimum of 5 years’ experience conducting evaluations against USAID/BHA and OECD-DAC evaluation criteria, ideally leading an evaluation team, with experience in designing evaluation methodologies/tools and in data analysis
    • At least 5 years’ experience using impact evaluation tools and methodologies
    • At least 5 years’ experience working in development contexts in Arid and Semi-Arid (ASAL) areas in Kenya
    • In-depth knowledge and at least 5 years’ experience in using quantitative and qualitative research methods
    • Individuals or firms with a background and at least 5 years’ experience in research methods, Livelihoods/Nutrition/Social Work, or development studies
    • Experience in writing evaluation reports to a high standard, in English

    Method of Application

    Interested candidates who meet the above requirements should submit their proposals by email to Consultancies.Kenya@concern.net with the subject line “SR109729 – Consultancy for External Evaluation of SERVE Project” by 4:00 pm on 24th April 2025.

  • Apply Before: 28 April 2025