FINAL EXTERNAL EVALUATION
SCOPE OF WORK FOR EXTERNAL EVALUATION OF TECH RRT PROGRAM

I. DESCRIPTION OF FINAL EVALUATION
A. PURPOSE OF THE EVALUATION
The final evaluation is being conducted to identify accomplishments, performance (internal and external), and constraints in the implementation of the program and in the attainment of the project’s specific objective. The results will inform and shape both the internal 2018 Tech RRT work plan and strategy and the external Nutrition Technical Body, spearheaded by UNICEF, where specialized nutrition technical expertise is planned to be housed in the future. The evaluation will be distributed to OFDA, consortium partners, UNICEF, the Global Nutrition Cluster and their partners, as well as other relevant stakeholders.

B. PROGRAM BACKGROUND
During humanitarian crises, the response capacity of governments, UN agencies, and international and local NGOs is often compromised as they struggle to find adequate technical and human resources to meet urgent technical needs. To respond to these inter-agency challenges, International Medical Corps and consortium partners Action Against Hunger and Save the Children, with funding from USAID-OFDA, have been providing partners with emergency response technical expertise and human resource support through their Technical Rapid Response Team (Tech RRT) since August 2015.

Goal: To improve the overall availability of capacitated emergency nutrition specialists for humanitarian technical response

Specific Objective: To improve overall emergency nutrition response

The Tech RRT project aims to improve overall emergency nutrition response by deploying technical surge advisors to major and complex humanitarian crises, by providing remote support, and by building the capacity of stakeholders involved in humanitarian responses. Any country or agency facing a humanitarian crisis can request this support, and advisors are deployable within 72 hours (depending on visa procedures) for a period of up to six weeks. All Tech RRT support, whether in-country or remote, should be used to benefit the quality and/or scale/reach of the humanitarian response, whether for the collective nutrition community or for individual agencies. The key criteria for deployment include 1) an L3/L2 crisis, 2) a humanitarian crisis, whether a rapid-onset emergency such as a natural disaster or civil unrest/war, or a slow-onset emergency, and 3) countries with limited technical capacity in nutrition in emergencies. The initiative is closely coordinated with the Global Nutrition Cluster (GNC) in Geneva and the UNICEF Program Division in New York.

C. EVALUATION QUESTIONS
The evaluation will aim to answer the following questions, organized according to OECD-DAC evaluation criteria:

Relevance/Appropriateness
• How useful were the deployments to the nutrition community where support was provided? Were the deployments relevant/timely?
• Is the composition of the team relevant and appropriate to the needs of the nutrition community?
• What does the analysis of the types of deployments show in terms of the global demand for technical support? Should other sectors be considered for integration to maintain or increase relevancy of this program? If so, which?
• How are other surge mechanisms, such as Save the Children’s HST and Action Against Hunger’s SMART team, complementing the Tech RRT?
• How relevant and useful is the work performed during non-deployment time to collective global and country level nutrition actors? Should it continue or should the time be restructured? (Are non-deployment time work plans relevant? Are the outputs of that work relevant and useful?)
• How is the project perceived and valued by the nutrition community? Is it seen as relevant?
• Would it be feasible and appropriate to integrate a research component within the scope of the Tech RRT, for example to incorporate applied research during emergency deployments or to test new approaches/technologies?

Effectiveness
• How effective was each component of the project (deployment and non-deployment time) in reaching its objectives?
o Were deployments effective in achieving their objectives?
o Was non-deployment time effective in providing support to the global nutrition agenda in each technical area? Are non-deployment work plans and activities well defined, well executed, and carried out in a timely manner?
o Were the team composition and time allocation of the advisors effective in allowing us to meet our goal?
• Did the deployments meet the needs of the country as defined in the TORs for the assignment?
• What were the internal and external factors that affected performance and outcomes? What measures were put in place to mitigate possible negative factors? Were those measures effective?
• Was the M&E plan effective in gathering information to measure the success of the implementation of the project? Were the project indicators appropriate for that purpose? Are there other indicators that could better measure this?
• Was the Tech RRT effective in coordinating and liaising with relevant partners/bodies? Were there others that should have been included?

Efficiency
• How efficient is the Tech RRT at responding to requests? Does the operational model facilitate rapid deployments?
o Were we efficient in deploying advisors, examining the whole process from the initial enquiry, through the deployment arrangements and the approval process with the donor and the deployment steering committee, to the advisor finally arriving in country? Are there other models that could be considered?
• How well were resources (funds and time) used to achieve results?
• How visible or well-known is the Tech RRT, particularly among those that would most likely request this support or recommend it to others?

Sustainability
• What financial models would be appropriate and most sustainable for this type of surge mechanism, which serves the collective?
• Could a cost recovery model be an appropriate option for the Tech RRT? What would be the advantages and drawbacks of such a model?
• What funding mechanisms/sources could be viable for financing the Tech RRT (e.g. crowdfunding, other potential donors)?
• How could the Tech RRT mechanism become a sustainable resource for emergencies? Could it realistically be housed within the Technical Body for Nutrition that UNICEF is planning to establish in the near future? What advantages and disadvantages would this carry with it?

Impact
• How has impact been measured in the project and how can this be improved?
• What has happened as a result of this project or its deployments?
• What real difference has the project made on humanitarian response broadly (either globally or at a country level)?
• Has the project made a real difference for people affected by emergencies? If so, can some examples be highlighted to demonstrate this?
• Is there a way to better understand or estimate how many people are affected by the project?
• What were unexpected results/outcomes of this project?

D. EVALUATION METHODS
In the first days of the evaluation, the consultant will propose and develop the specific evaluation methods; however, these should include the following:
A. Review of project and other relevant documentation, including:
a. Core program documents: proposal and amendments, SOPs, checklists, pipeline, etc.
b. Performance Monitoring Data
c. Monthly Reports
d. End of Mission Reports
e. Deployment Evaluations
f. June 2016 and January 2017 Face-to-face Meeting Reports
g. Annual donor reports (2016 and 2017)
h. Website, Twitter
i. External presentations (OFDA, GNC, etc.)
j. Other relevant documentation: Core Humanitarian Standards, Cluster Technical Taskforce documents, Rapid Response Teams Cost Efficiency Study, documents regarding the Technical Body for Nutrition, etc.
B. Self-administered surveys: these should be completed by Tech RRT advisors, steering committee members and other relevant stakeholders, based on the above evaluation questions.
C. Field visit to a country that has received multiple deployments: in order to track the use of deployment inputs and deliverables and to gain an in-depth perspective on the longer-term effects of the support provided
D. Key informant interviews:
a. At the global and regional levels with Tech RRT Advisors and Steering Committee, OFDA/USAID, GNC, UNICEF, etc.
b. At the country level with government authorities (MoH), Nutrition Cluster Coordinators, in-country supervisors, staff involved with deployments, etc.

II. SAMPLING METHODS & SELECTION CRITERIA FOR INDIVIDUALS AND/OR GROUPS INTERVIEWED
This evaluation will be carried out as a qualitative assessment, and purposive sampling will be used as the main sampling method. Selection criteria are based on individuals who interacted with the Advisors during deployment, as well as other key nutrition community stakeholders who will critically evaluate the Tech RRT mechanism.

III. DATA ANALYSIS PLANS
Since the data collected will be qualitative for the most part, qualitative analysis methods and software will be used to analyze the data. Should the evaluator decide to do any quantitative data collection, they will submit a plan for such analysis at the beginning of the consultancy.

IV. PRIVACY MEASURES AND PLANS FOR ENSURING PROTECTION AND CONFIDENTIALITY DURING DATA COLLECTION
All evaluation materials that are shared with the Tech RRT (e.g. data from interviews, case studies) will be anonymized by the evaluator prior to sharing to ensure confidentiality; any agency- or individual-specific information will therefore only be available to the consultant. The evaluation report will include only non-sensitive information and will not be agency- or individual-specific.

V. DELIVERABLES
A preliminary and a final evaluation report will be developed; both should adhere to the OECD Development Assistance Committee (DAC) evaluation criteria and standards in data analysis and reporting.
Preliminary and Final Report
• Executive Summary
• Introduction
• Methodology
• Relevance/Appropriateness
• Effectiveness
• Efficiency
• Sustainability
• Impact
• Conclusion and Recommendations
• Annexes

The performance of the overall intervention should be ranked against the following DAC criteria, and this ranking should be included as an annex to the report:
Impact
Sustainability
Coherence
Coverage
Relevance/Appropriateness
Effectiveness
Efficiency

Two to three best practices should be identified from the project and briefly documented so that case studies can potentially be developed from them later; these can be included in the body of the report but should, at a minimum, be included in an annex.

VI. PROFILE
• Good knowledge of nutrition and particularly of nutrition in emergency contexts
• Significant field experience in the evaluation of humanitarian projects
• Relevant degree / equivalent experience related to the evaluation to be undertaken
• Significant experience in coordination, design, implementation, monitoring and evaluation of emergency programs
• Good communication skills and experience in undertaking qualitative data collection
• Ability to write clear and useful reports (may be required to produce examples of previous work)
• Fluent in English
• Ability to manage the available time and resources and to work to tight deadlines
• Independence from the parties involved

VII. EVALUATION SCHEDULE

Timing and Proposed Activities
November 13, 2017: Begin evaluation
November 30, 2017: Finish review, report writing
December 8, 2017: Submission of first draft; review by Steering Committee
December 13, 2017: Incorporate comments and prepare final report
December 15, 2017: Report finalized

Estimated Work plan and timetable

Review of project and external documentation: 3 days
Development of methodology and preparation of study tools: 1 day
Data collection - remote interviews and survey: 5 days
Data collection - country visit (interviews, observations, etc.): 7 days
Analysis and drafting of report: 7 days
Incorporate comments and feedback and finalize report: 2 days
Total: 25 days

If you are interested in the position, please apply through International Medical Corps - Careers before 15 November 2017.
