CONSULTANCY SERVICES FOR ENDLINE EVALUATION TERMS OF REFERENCE


ENDLINE EVALUATION TERMS OF REFERENCE

(SOMALILAND DURABLE SOLUTIONS CONSORTIUM)

Abbreviations

SDSC Somaliland Durable Solutions Consortium

WVS World Vision Somalia

NRC Norwegian Refugee Council

CWW Concern Worldwide

DRC Danish Refugee Council

DAC Displacement Affected Community

IDP Internally Displaced Person

ToR Terms of Reference

RNG Random Number Generator

GIS Geographical Information Systems

DME Design Monitoring and Evaluation

FGD Focus Group Discussion

KII Key Informant Interviews

ReDSS Regional Durable Solutions Secretariat

1.0 Introduction

1.1 Evaluation Summary

Project name(s)

“Wadajir”: Enhancing Durable Solutions for and Reintegration of Displacement Affected Communities in Somaliland

Project goal

Create a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution

Project outcomes

Outcome 1: DACs are able to influence decisions, policies and agreements that affect them collectively, as well as where to live and how they are governed.

Outcome 2: DACs have improved access to and use of basic services/material safety, comparable to other non-displacement affected communities.

Outcome 3: DACs have improved access to adequate livelihoods through generating income and assets, gainful employment, and managing financial risk, comparable to other non-displacement affected communities.

Outcome 4: Learning on best practices and lessons on Durable Solutions disseminated through Somaliland Durable Solutions Consortium programming is utilized by actors and stakeholders working in Somaliland.

Target beneficiaries

63,917 DAC members; 30,000 people affected by displacement (19,662 IDPs, 2,100 returnees, 8,238 members of host communities) in Hargeisa and Burao, focusing on male and female youth, women and children in refugee camps and IDP settings, as well as persons with special needs (e.g. the elderly and other vulnerable groups).

Project locations

Burao: Ali Hussein, Adan Sulieman, Aqil Yare and Koosar, and 7 health facilities in Burao District.

Hargeisa: Digaale, Statehouse, Jimcaale, Ayah 4, Ayah 2 and Ayah B1.

Project duration

March 2017-February 2020

Evaluation type

Endline Evaluation

Evaluation purpose

The purpose of the endline evaluation is to document and inform the key project stakeholders (donors, partners and beneficiaries) of the project's progress with reference to the OECD/DAC evaluation criteria of relevance, effectiveness, efficiency, sustainability and impact in relation to project results. The endline evaluation will also help draw out key lessons learnt and best practices for the project stakeholders.

Methodology

The endline evaluation will adopt a mixed-methods design, including quantitative and qualitative techniques, as summarised below:

Quantitative

  • Beneficiary household surveys

Qualitative

  • Focus group discussion (FGD)
  • Key Informant Interviews (KII)
  • Document Reviews and Case studies

Expected evaluation period: 15th January to 28th February 2020

2. Description of Projects Being Evaluated

The Somaliland Durable Solutions Consortium (SDSC), a consortium of five agencies led by World Vision and including the Danish Refugee Council (DRC), the Norwegian Refugee Council (NRC), Concern Worldwide (CWW) and Taakulo Somaliland Community (TASCO), has been implementing the three-year “Wadajir” project in the Hargeisa and Burao Districts of Somaliland. The project started in March 2017 and will come to an end in February 2020. The goal of the project is to contribute to the creation of a conducive environment for communities in Somalia affected by displacement or mixed migration, enabling them to reach durable solutions. To this end, the project works towards increased access to essential services and the creation of realistic livelihood opportunities in the main access areas of return and departure in Somalia. The project is undertaken together with selected governmental line ministries (Ministry of Planning, Ministry of Education, Ministry of Resettlement, Rehabilitation and Reconstruction), district authorities, community leaders and all other relevant stakeholders.

The project will come to an end in February 2020; therefore, this Terms of Reference (ToR) has been prepared to undertake an endline evaluation of the project performance for the implementation period. The evaluation will be conducted by an external Consultant, through a participatory approach involving the partner organisations, line ministries and the beneficiary communities. The evaluation results will help the key stakeholders measure the level of project success with reference to service delivery to the project beneficiaries.

2.1 Project Goal

The main project goal is to create a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution.

Project Indicators:

Impact

§ Percentage of returnees received over the past 12 months who are willing to stay in their place of origin, disaggregated by sex and age (+/-26).

§ Percentage of IDPs in the area of intervention integrated into the host community with equal access to resources, disaggregated by sex and age (+/-26).

§ Percentage of youth with intentions to stay in their place of origin, disaggregated by sex and age (+/-26).

§ Percentage of people in the host community with a change in perception towards promoting co-existence, disaggregated by sex and age (+/-26).

Outcome 1: Displacement Affected Communities (DACs) are able to influence decisions, policies and agreements that affect them collectively as well as where to live and how they are governed.

§ Percentage of target population in community groups with the ability to address or voice their concerns and engage in advocacy

§ Number of effective and accessible mechanisms in place to ensure access to land and/or secure tenure (housing, land and property rights)

§ Percentage of DACs who believe that the government is responsive to their rights and needs.

Outcome 2: DACs have improved access to and use of basic services/material safety, comparable to other non-displacement affected communities

§ Percentage of target population who are able to achieve an adequate standard of living.

§ Percentage of DACs with access to basic health care.

§ Percentage of target population that reports feeling safe in their community.

Outcome 3: DACs have improved access to adequate livelihoods through generating income and assets, gainful employment, and managing financial risk, comparable to other non-displacement affected communities.

§ Percentage of unemployed among the displaced population compared to the resident population, disaggregated by sex and age (+/-26)

§ Percentage increase/decrease in mean income per month for displaced population by job type, disaggregated by sex and age

§ Percentage of target population having obtained a loan when needed (+/-26).

§ Percentage of households that report increased household income.

Outcome 4: Learning on best practices and lessons on Durable Solutions disseminated through SDSC programming is utilized by actors and stakeholders working in Somaliland

§ Number of learning recommendations that are incorporated by state or federal government policies after learning has been disseminated.

§ Number of learning recommendations that are incorporated by regional durable solutions institution policies after learning has been disseminated.

§ Number of learning recommendations that are incorporated by district government actions after learning has been disseminated.

Output level indicators

§ Number of community representation structures in place.

§ Number of Community Action Plans (CAPs) drafted and approved for implementation.

§ Number of effective and accessible mechanisms to address Housing Land and Property (HLP) disputes relevant to displacement.

§ Number of DAC target population members with access to essential health care and protection services, in comparison to the host community.

§ Number of health facilities with essential minimum health care services.

§ Number of healthcare workers (midwives, nurses, community health workers and female health workers) trained.

§ Number of Gender-Based Violence (GBV) service providers receiving training who demonstrate improved knowledge and attitudes in GBV practice.

§ Number of community outreach activities that include information about the locations and benefits of timely care for sexual assault survivors and other forms of GBV.

§ Number of police and judicial personnel in returnee/refugee areas effectively trained on prevention of GBV and human rights violations (compared to national standards).

§ Number of Technical, Vocational Education and Training (TVET) centres established or rehabilitated (government or community).

§ Number of TVET centres with an updated curriculum, enhanced understanding of the market and linkages to employers.

§ Number of teachers and management staff in the TVET institutions trained.

3. Evaluation Target Audiences

The endline project evaluation is intended to benefit multiple stakeholders that have been involved directly or indirectly in the project implementation process. In particular, the following are the key stakeholders that will be involved in the evaluation process;

  • Project beneficiaries including IDP and host communities in Hargeisa and Burao Districts
  • Targeted health facilities in Burao District
  • Gender-Based Violence and Child Protection Committees
  • Ministry of Planning, Ministry of Education, Ministry of Resettlement, Rehabilitation and Reconstruction, Somaliland National Displacement and Refugee Agency
  • Consortium members, including World Vision, DRC, NRC, CWW and TAAKULO
  • The Regional Durable Solutions Secretariat (ReDSS)
  • World Vision Germany
  • European Union (EU).

4. Evaluation type

This is an end-of-project evaluation aimed at assessing the progress made by the project towards achieving the project goal of creating a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution. The assessment of the project impact will focus on the contribution made by the project from inception.

5. Evaluation Purpose and Objectives

The primary purpose of this evaluation is to assess the impact, appropriateness, effectiveness, efficiency and sustainability of the SDSC project. The endline evaluation will also help draw out key lessons learnt and best practices for the project stakeholders. In particular, the evaluation will be shaped by the following key evaluation questions, organised by criterion:

Impact

  • What has been the impact of the project interventions on the community, and what has been the level of impact on the target beneficiaries?
  • What contributions have the project interventions made to the impact measured on the community and the target beneficiaries, and what other factors and actors contributed?

Relevance

  • Community involvement and participation in the design process, goal setting, planning and implementation.
  • How equitably has the project benefited women, men, boys and girls, returnees, IDPs and refugees?
  • The relevance and appropriateness of the project design to the needs of the community.

Effectiveness

  • What are the achievements against the set objectives?
  • Compare actual with planned outputs, and how have outputs been translated into outcomes?
  • The evaluation shall also establish possible deviations from planned outputs and likely outcomes.

Efficiency

  • How adequate were the available resources, qualitatively and quantitatively?
  • Were all the project resources utilised optimally?
  • What alternative low-cost approaches could have been used to achieve similar results?
  • How could the efficiency of the project be improved without compromising outputs?
  • Assess the timeliness of implementing the project activities.
  • How adequate were the reporting and monitoring systems of the project?
  • Have the project outputs been achieved at a reasonable cost?

Sustainability

  • Are there sustainability plans, structures and skills in place to ensure the sustainability of project benefits? How adequate are they?
  • How are the community and local partners prepared to continue with the project outcomes?
  • How likely are the outcomes to be sustainable and enduring? In what ways will the project leave a legacy for its beneficiaries and the communities?
  • In what ways are women and men in the communities, the local partners and government stakeholders prepared to continue with the project outcomes?

6. Evaluation Methodology

The evaluation methodology will be designed in alignment with World Vision’s Learning through Evaluation with Accountability and Planning (LEAP) guidelines and principles. To ensure the quality of evidence, the evaluation will be designed with reference to the Bond Evidence Principles Checklist. Specific reference will be made to the 5 key dimensions of voice and inclusion, appropriateness, triangulation, contribution and transparency.

The data collection process will apply both quantitative and qualitative methods. A detailed evaluation methodology will be designed by the external Consultant in consultation with WV Design Monitoring and Evaluation Manager, Consortium Project Coordinator and WV Germany Monitoring and Evaluation Advisor who will validate the sampling strategy and procedures.
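
Purely as an illustration of the kind of sample size calculation that could support the sampling strategy, the sketch below applies the standard formula for estimating a proportion, with a finite population correction, a design effect and a non-response allowance. The population figure and all parameter values are assumptions for the example, not values prescribed by this ToR.

```python
import math

def sample_size(population, p=0.5, z=1.96, margin=0.05, deff=1.5, nonresponse=0.05):
    """Illustrative household sample size for estimating a proportion.

    population  -- size of the target population (assumed for the example)
    p           -- expected proportion (0.5 is the most conservative choice)
    z           -- z-score for the confidence level (1.96 for 95%)
    margin      -- desired margin of error
    deff        -- design effect for cluster sampling (assumed)
    nonresponse -- allowance for non-response (assumed)
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)      # simple random sample size
    n_fpc = n0 / (1 + (n0 - 1) / population)         # finite population correction
    n_adj = n_fpc * deff / (1 - nonresponse)         # adjust for clustering and non-response
    return math.ceil(n_adj)

# Example with a hypothetical DAC household population of 10,000
print(sample_size(population=10_000))
```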

The detailed methodology design must include the following:

  • The evaluation design
  • Sampling for qualitative and quantitative surveys
  • Data collection instruments, protocols and procedures
  • Procedures for analysing quantitative and qualitative data
  • Data presentation/dissemination methods
  • Report writing and sharing.

The endline evaluation should take into account the methodology of the baseline and midterm evaluations to ensure data comparability. The key data collection methods will include the following, among others:

  • Document reviews, including the project proposal, baseline report, quarterly and semi-annual reports, midterm evaluation report, monitoring reports and project review reports
  • Focus Group Discussions (FGDs) involving primary project participants
  • Key Informant Interviews (KIIs) with the line ministries, district authorities, consortium members and community leaders, among others
  • Quantitative household survey
  • Reflection and feedback sessions with staff and partners.

The Consultant will be expected to employ mobile data collection on smartphones, leveraging the Kobo Toolbox. Geographical Information System (GIS) solutions will also be employed in the evaluation process, from data collection through analysis to the presentation of results.
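
As an illustration only, the sketch below shows how survey submissions exported from a Kobo Toolbox project as CSV might be loaded and summarised per site during data collection; the file name and column names (including the GPS column) are hypothetical and would depend on the actual form design.

```python
import pandas as pd

# Hypothetical CSV export of submissions from Kobo Toolbox
df = pd.read_csv("wadajir_endline_household_survey.csv")

# Kobo geopoint questions are typically exported as "latitude longitude altitude accuracy";
# the column name "gps_location" is an assumption about the form design.
coords = df["gps_location"].str.split(expand=True).iloc[:, :2].astype(float)
df["latitude"], df["longitude"] = coords[0], coords[1]

# Simple per-site summary for quality monitoring of fieldwork
summary = (
    df.groupby("settlement")                       # hypothetical site/settlement column
      .agg(interviews=("respondent_id", "count"),  # hypothetical respondent ID column
           mean_lat=("latitude", "mean"),
           mean_lon=("longitude", "mean"))
)
print(summary)
```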

7. Evaluation Deliverables

The Consultant will be expected to deliver the following outputs:

  1. An inception report detailing the approach and methodology to be used, sample size calculations, a detailed execution plan and the data collection tools.
  2. A draft report submitted to WV Somalia within a timeline agreed between WV Somalia and the Consultant.
  3. A presentation of the key findings and recommendations to the SDSC Consortium in Hargeisa.
  4. All indicators presented overall and disaggregated by sex and disability status, where appropriate (an illustrative disaggregation sketch follows this list).
  5. Raw collected data, complete with variable labels and codes after analysis, and the final evaluation tools, submitted to WV Somalia alongside the final report.
  6. Final report (soft copy) and 3 hard copies submitted to WV Somalia Quality Assurance team and SDSC Project Coordinator.
  7. The Consultant should note that the Final Evaluation Report shall follow the structure below customized from the UNDP (2009) Handbook on Planning, Monitoring, and Evaluation for Development Results.
  8. The evaluation report will also be guided by Bond Evidence Principles Checklist. Specific reference will be made to the 5 key dimensions of voice and inclusion, appropriateness, triangulation, contribution and transparency.
  9. The Consultant will be required to prepare and submit an executive brief of the evaluation report with infographics summarising the key project achievements, recommendations, lessons learnt and the best practices.
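
Purely as an illustrative sketch of deliverable 4, the snippet below shows one way indicator estimates could be reported overall and disaggregated by sex and disability status using pandas; the column names, values and the example indicator are assumptions, not fields defined in this ToR.

```python
import pandas as pd

# Hypothetical cleaned household survey records
df = pd.DataFrame({
    "sex": ["female", "male", "female", "male", "female"],
    "disability": ["no", "no", "yes", "no", "yes"],
    "feels_safe": [1, 0, 1, 1, 0],   # example binary indicator (Outcome 2: feels safe in community)
})

# Overall estimate
overall = df["feels_safe"].mean()

# Disaggregated estimates, as required for reporting
by_sex = df.groupby("sex")["feels_safe"].mean()
by_disability = df.groupby("disability")["feels_safe"].mean()

print(f"Overall: {overall:.1%}")
print(by_sex.map("{:.1%}".format))
print(by_disability.map("{:.1%}".format))
```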

7.1 Evaluation Report Structure

Title and Opening pages (front matter)-should provide the following basic information:

i. Name of the project evaluated

ii. Time frame of the evaluation and date of the report

iii. Project location (districts and country)

iv. SDSC consortium logo as well as partner organisations

v. Acknowledgments

Table of Contents-including boxes, figures, tables, and annexes with page references.

List of acronyms and abbreviations

Executive Summary

A stand-alone section of two to three pages that should:

  • Briefly describe the intervention (the project(s) evaluated).
  • Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses.
  • Describe key aspects of the evaluation approach and methods.
  • Summarise the key findings, conclusions, and recommendations.

Introduction

This section will;

  1. Provide a brief explanation of why the evaluation was conducted, why the intervention is being evaluated at this point in time, and why it addressed the questions it did.
  2. Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why and how they are expected to use the evaluation results.
  3. Identify the intervention (the project(s) evaluated).
  4. Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the endline evaluation and satisfy the information needs of the report’s intended users.

Description of the Intervention

This section will provide the basis for report users to understand the logic and assess the merits of the endline evaluation methodology, and to understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. In particular, the section will:

  1. Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.
  2. Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.
  3. Link the intervention to the durable solutions framework.
  4. Identify any significant changes (plans, strategies, logical frameworks) that have occurred over time and explain the implications of those changes for the evaluation.
  5. Identify and describe the key partners involved in the implementation and their roles.
  6. Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.
  7. Indicate the total resources, including human resources and budgets.
  8. Describe the context of the social, political, economic, and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.
  9. Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation Scope and Objectives

This section of the report will provide an explanation of the evaluation’s scope, primary objectives and main questions:

  1. Evaluation scope-define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.
  2. Evaluation objectives-spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.
  3. Evaluation criteria-define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation.
  4. Evaluation questions-the evaluation questions define the information that the endline evaluation will generate. The report will detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

Evaluation Approach and Methods

This section will describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description will help report users judge the merits of the methods used in the endline evaluation and the credibility of the findings, conclusions and recommendations. The description of methodology will include discussion of each of the following:

  1. Data sources-sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions.
  2. Sample and sampling frame-the sample size and characteristics; the sample selection criteria; the process for selecting the sample (e.g. random, purposive); and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results (an illustrative selection sketch follows this list).
  3. Data collection procedures and instruments-methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source and evidence of their reliability and validity.
  4. Performance standards-the standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).
  5. Stakeholder engagement-stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.
  6. Major limitations of the methodology-major limitations of the methodology shall be identified and openly discussed as to their implications for evaluation, as well as steps taken to mitigate those limitations.
  7. Data analysis-procedures used to analyse the data collected to answer the evaluation questions. This will detail the various steps and stages of analysis that will be carried out, including the steps to confirm the accuracy of data and the results. The report will discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.
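
To make the distinction between random and purposive selection concrete, the sketch below illustrates a seeded random draw of households from a listing frame; the file name, column names and per-settlement quota are hypothetical, and this is only one of several selection approaches the Consultant might justify.

```python
import pandas as pd

# Hypothetical household listing (sampling frame); columns assumed: settlement, household_id
frame = pd.read_csv("household_listing.csv")

# Seeded random selection so the draw is reproducible and auditable
SAMPLE_PER_SETTLEMENT = 30
sample = (
    frame.groupby("settlement", group_keys=False)
         .apply(lambda g: g.sample(n=min(SAMPLE_PER_SETTLEMENT, len(g)), random_state=2020))
)
sample.to_csv("selected_households.csv", index=False)
```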

Findings and Conclusions

This section will present the evaluation findings based on the analysis and conclusions drawn from the findings. In particular,

Findings: This section will present findings as statements of fact that are based on analysis of the data. The evaluation findings will be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results will be explained, as well as factors affecting the achievement of intended results. The assumptions or risks in the project design that subsequently affected implementation will also be discussed.

Conclusions: This section will be comprehensive and balanced and highlight the strengths, weaknesses and outcomes of the intervention. The conclusion section will be substantiated by the evidence and logically connected to the evaluation findings. The conclusion will also respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making.

Recommendations: The evaluation will seek to provide very practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations will be specifically supported by the evidence and linked to the findings and conclusions around the key questions addressed by the evaluation. They shall also address the sustainability of the initiative and comment on the adequacy of the project exit strategy.

Lessons Learned

The report will include a discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, even the evaluation methods) that is applicable to a similar context. Concise lessons based on specific evidence will be presented in the endline evaluation report.

Report Annexes

The Annex section will include the following to provide the report reader with supplemental background and methodological details that enhance the credibility of the report.

  • ToR for the evaluation
  • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate
  • List of individuals or groups interviewed or consulted and sites visited
  • List of supporting documents reviewed
  • Project results map or results framework
  • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators.

8. Time frame

The overall evaluation process is expected to take 45 days, including preparation, data collection, analysis and reporting. The Consultant should be able to undertake some of the tasks concurrently to fit within the planned timeframe, without compromising the quality expected. The assignment is expected to commence on 15th January 2020, with the final evaluation report expected by 28th February 2020.

9. Authority and Responsibility

WV Somalia will establish an evaluation team to oversee all the related tasks. The DME Manager will be responsible for the overall coordination of all the evaluation tasks with the Consultant. In addition, Consortium DME Coordinator, Consortium Project Coordinator, Regional Operations Manager, Quality Assurance & Strategy Manager and WV Germany Monitoring and Evaluation Advisor will provide all the necessary technical and operational support required throughout the evaluation process.

Support from WV Somalia

WV Somalia will be responsible for the following:

  • Share all necessary documents with the Consultant to finalize the evaluation methodology and data collection tools
  • Provide input for evaluation study methodology, data collection tools and report.
  • Ensure that input from SDSC Consortium is circulated and shared with external Consultant
  • Flight expenses for the Consultant to Somaliland (where necessary)
  • Vehicle hire to support the evaluation exercise
  • Food and accommodation for the Consultant in Somaliland
  • Working space for the Consultant while in Somaliland
  • Recruitment and payment of enumerators
  • Stationery for data collection
  • Overall accountability of the evaluation process
  • Guidance and coordination throughout all the phases of evaluation, keeping communication with external Consultant throughout all phases
  • Provide support to the evaluation technical lead (external Consultant) for the evaluation field visits processes such as orientation and training of enumerators, FGDs and KIIs
  • Closely follow up the data collection process, ensuring quality control, daily debriefing, meeting the timelines set for interview completion;
  • Inform the evaluation audiences of their involvement in the study and help set specific dates for the evaluation field schedule.
  • Provide smartphones/tablets, Kobo Toolbox server for data collection where required.

The Consultant will be responsible for the following:

  • Review all relevant documents for evaluation study
  • Develop the evaluation study design, including the survey methodology and data collection tools (reviewing the existing household questionnaire, focus group guides, interview protocols and data entry templates). In addition, prepare a field manual for training, in consultation with the evaluation team, reflecting WV Somalia's feedback on the methodology. These should be heavily based on the tools used at baseline and midterm to allow appropriate comparisons over the life of the project
  • Design the XML forms, data entry templates, procedures and systems, and train data entry clerks in the use of the templates
  • Develop the fieldwork schedule in consultation with the evaluation team
  • Conduct training for enumerators during the field visit phase and finalize the evaluation schedule
  • Supervise the data collection process, provide advice and ensure the quality of the data
  • Conduct interviews (KII) with the Consortium members and line ministries
  • Undertake data analysis and report writing. At least two drafts are expected to be provided to WV Somalia, with feedback addressed in each round, before submission of the final report
  • Provide the required data, complete and labelled in English (variables and values), in both SPSS and Microsoft Excel file formats (an illustrative labelling sketch follows this list).
  • Provide final versions of data collection tools.
  • Provide daily field briefing to the DME Manager, SDSC DME Coordinator, SDSC Project Coordinator on the progress and any challenges from the field.
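
A minimal sketch of how labelled data could be prepared in both required formats is given below; it assumes the pyreadstat library is available for writing SPSS files, and the variable names, labels and values shown are placeholders rather than fields defined in this ToR.

```python
import pandas as pd
import pyreadstat  # assumed available; used here only to write a labelled SPSS .sav file

# Hypothetical analysed dataset with coded values
df = pd.DataFrame({
    "sex": [1, 2, 1],
    "hh_income_usd": [120, 85, 240],
})

column_labels = ["Sex of respondent", "Monthly household income (USD)"]
value_labels = {"sex": {1: "Female", 2: "Male"}}

# SPSS file with English variable and value labels
pyreadstat.write_sav(df, "wadajir_endline.sav",
                     column_labels=column_labels,
                     variable_value_labels=value_labels)

# Matching Microsoft Excel export
df.to_excel("wadajir_endline.xlsx", index=False)
```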

10. Limitations

Time and security may be major limitations for assessment processes in fragile and volatile contexts such as Somalia, which often makes it challenging to keep strictly to a set agenda. In addition, in Somalia households spend a better part of the afternoon hours in prayer, and it will be hard for enumerators to administer many questionnaires per day (in an effort to complete the assessment in a timely manner). To address this, WV Somalia will allocate extra overflow days for field data collection. The WV Somalia team will also work closely with the security department to ensure that the evaluation field processes are conducted at the most appropriate times and in secure conditions. The Consultant should therefore be able to demonstrate some flexibility when required.

11. Documents

The key documents to be reviewed for the evaluation study are as follows:

  • Project document (needs assessment, proposal, log frame)
  • The Regional Durable Solutions Secretariat (ReDSS) framework
  • Baseline Report
  • Midterm evaluation report
  • Quarterly, semi-annual, annual and monitoring reports
  • Training reports
  • Success stories
  • Any district level secondary data and other relevant documents and reports.

12. Qualifications of the Consultant

We are looking for a Consultant with the following skills and qualifications;

  • The team leader MUST possess at least a Master’s Degree in any of the following fields: International Development, Social Sciences, Statistics, Community Development, Development Studies, Local Government or any related field
  • Strong and documented experience in conducting participatory qualitative assessments related to food security, livelihoods, and water and sanitation programming
  • Demonstrated experience in leading at least three similar project evaluation studies, such as surveys and group interviews
  • At least 10 years’ experience in conducting baseline studies and evaluations for complex projects, such as livelihoods, education and protection, infrastructure development, health, and water, sanitation and hygiene projects implemented by non-governmental and private sector actors
  • A solid understanding of remote learning and the use of mobile technology in data collection
  • Demonstrated experience in leading teams and training local staff in quantitative and qualitative data collection tools, including data entry templates
  • Demonstrated experience in designing survey methodology and data collection tools, and in processing and analysing data.
  • Ability to interact with host government, partners as requested by WV Somalia;
  • Strong organizational, analytical and reporting skills, presentation skills, attention to detail, ability to meet deadlines, and proficiency in SPSS or other statistical packages, Microsoft Office and qualitative data analysis software/tools.
  • Previous experience in a fragile country with tight security context will be preferred.
  • Capacity to use mobile data collection and GIS tools for data collection, and analysis of survey results.
  • Excellent verbal and written communication in English required.

How to apply:

13. Application Process and Requirements

Qualified and interested parties are asked to submit the following;

  1. Letter of interest in submission of a proposal
  2. A detailed technical proposal clearly demonstrating a thorough understanding of this ToR, including but not limited to the following:
  3. Consultant/Company Profile
  4. Description of the evaluation methodology as outlined in this ToR
  5. Demonstrated previous experience in similar assignments and qualifications outlined in this ToR (with submission of two most recent reports)
  6. Proposed data management plan (collection, processing and analysis)
  7. Proposed timeframe detailing activities and a work plan.
  8. Team composition and level of effort of each proposed team member (include CVs of each team member).
  9. A financial proposal with a detailed breakdown of costs for the study quoted in United States dollars.

All applications should be sent electronically to [email protected] with attachments in PDF and the subject line: “Technical and Financial Proposal for End of Project Evaluation-SDSC-Somaliland”.

The top three shortlisted Consultants will be required to make an oral presentation of the technical proposal to Supply Chain and Core Project Evaluation Technical team to inform the final decision on the award of the contract.

The submission of technical and financial proposals closes on: 30th December 2019.

World Vision International is an Evangelical Christian humanitarian aid, development, and advocacy organization.

It was founded in 1950 by Robert Pierce as a service organization to meet the emergency needs of missionaries. In 1975 development work was added to World Vision's objectives.

It is active in more than 90 countries with a total revenue including grants, product and foreign donations of $2.79 billion (2011).

The World Vision Partnership is a global community of people passionately committed to improving the lives and futures of the world’s most vulnerable children.

We are one of the world’s largest child-focused development organisations, with over 45,000 staff in almost 100 countries, serving 100 million people annually. We work on every level to achieve our goal of child well-being – from international activism to checking in on children face-to-face.

Our people are our greatest asset. Each staff member has unique experience and skills, and it’s our job to provide them with the training and opportunities they need to make their greatest contribution to our work worldwide.

According to our latest staff survey, over 80% of staff who responded are excited about the future, ready to put in extra effort, proud to work for World Vision and ready to recommend us to others as a great employer.

We offer a wide range of rewarding career opportunities, from tackling humanitarian emergencies, working in development and advocacy, to performing vital support roles such as finance, IT, marketing and human resources.

World Vision has the privilege to partner with communities in 25 countries in Africa: Angola, Burundi, Chad, Congo (DRC), Ethiopia, Ghana, Kenya, Lesotho, Malawi, Mali, Mauritania, Mozambique, Niger, Rwanda, Senegal, Sierra Leone, Somalia, South Africa, South Sudan, Sudan, Swaziland, Tanzania, Uganda, Zambia and Zimbabwe.

World Vision aims to achieve the sustained well-being of children within families and communities, especially the most vulnerable by ensuring that children:

  • Enjoy good health
  • Are educated for life
  • Experience love of God and their neighbour
  • Are cared for, protected and participate
Connect with us
0 USD Somalia CF 3201 Abc road Consultancy , 40 hours per week World Vision International

ENDLINE EVALUATION TERMS OF REFERENCE

(SOMALILAND DURABLE SOLUTIONS CONSORTIUM)

Abbreviations

SDSC Somaliland Durable Solution Consortium

WVS World Vision Somalia

NRC Norwegian Refugee Council

CWW Concern Worldwide

DRC Danish Refugee Council

DAC Displacement Affected Community

IDP Internally Displace Person

ToR Terms of Reference

RNG Random Number Generator

GIS Geographical Information Systems

DME Design Monitoring and Evaluation

FGD Focus Group Discussion

KII Key Informant Interviews

ReDSS Regional Durable Solutions Secretariat

1.0 Introduction

1.1 Evaluation Summary

Project name (s)

“Wadajir"-Enhancing Durable Solutions for and Reintegration of Displacement Affected Communities in Somaliland

Project goal

Create a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution

Project outcomes

Outcome 1: DACS are able to influence decisions, policies and agreements that affect them collectively as well as where to live and how they are governed.

Outcome 2: DACS have improved access and use of basic services/material safety as other non-displacement affected communities.

Outcome 3: DACS have improved access to adequate livelihoods through generating income and assets, gainful employment, and managing financial risk as other non-displacement affected communities.

Outcome 4: Learning on best practices and lessons on Durable Solutions disseminated by Somaliland Durable Solutions Consortium programming are utilized by actors and stakeholders working in Somaliland.

Target beneficiaries

63917 DAC members; 30,000 people affected by displacement (19,662 IDPs, 2,100 returnees, 8,238 members of host communities) in Hargeisa and Burao focusing on male and female youth, women and children in refugee camps and IDP settings as well as persons with special needs (e.g. elderly, other vulnerable groups).

Project locations

Burao: Ali Hussein, Adan Sulieman, Aqil yare and Koosar, and 7 health facilities in Burao District.

Hargeisa: Digaale, Statehouse, Jimcaale, Ayah 4 and Ayah 2, Ayah B1,

Project duration

March 2017-February 2020

Evaluation type

Endline Evaluation

Evaluation purpose

The purpose of the endline evaluation is to document and inform the key project stakeholders (donors, partners and beneficiaries) of the project progress with reference to the OECD/DAC evaluation criteria on relevance, effectiveness, efficiency, sustainability and impacts in relation to project results. The endline evaluation will also help to draw key lessons learnt and the best practices to the project stakeholders.

Methodology

The endline evaluation will adopt mixed methods design including quantitative and qualitative techniques as summarised below:

Quantitative

  • Beneficiary household surveys

Qualitative

  • Focus group discussion (FGD)
  • Key Informant Interviews (KII)
  • Document Reviews and Case studies

Expected Evaluation time: 15th January-28th February 2020

2. Description of Projects Being Evaluated

The Somaliland Durable Solutions Consortium (SDSC), a consortium of five agencies led by World Vision, including the Danish Refugee Council (DRC), Norwegian Refugee Council (NRC), Concern Worldwide (CWW) and Taakulo Somaliland Community (TASCO) has been implementing a 3 years “Wadajir" project in Hargeisa and Burao Districts of Somaliland. The project started in March 2017, and it will come to an end in February 2020. The goal of the project is to contribute to the creation of a conducive environment for communities in Somalia affected by displacement or mixed migration enabling them to reach durable solutions. To this end, the project works towards increased access to essential services and creation of realistic livelihood opportunities in the main access areas of return and departure in Somalia. The project is undertaken together with selected governmental line ministries (Ministry of Planning, Ministry of Education, Ministry of Resettlement, Rehabilitation and Reconstruction), District authorities, community leaders and all other relevant stakeholders.

The project will come to an end in February 2020; therefore, this Terms of Reference (ToR) has been prepared to undertake an endline evaluation of the project performance for the implementation period. The evaluation will be conducted by an external Consultant, through a participatory approach involving the partner organisations, line ministries and the beneficiary communities. The evaluation results will help the key stakeholders measure the level of project success with reference to service delivery to the project beneficiaries.

2.1 Project Goal

The main project goal is to create a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution.

Project Indicators:

Level

Indicator

Impact

Percentage of returnees received over the past 12 months and are willing to stay in place of origin, disaggregated by sex and age (+/-26).

Percentage of IDPs in the area of intervention integrated into the host community with equal access to resources, disaggregated by sex and age (+/-26).

Percentage of youth with intentions to stay in place of origin, disaggregated by sex and age (+/-26).

Percentage of people in the host community with a change in perception towards promoting co-existence, disaggregated by sex and age (+/-26).

Outcome 1: Displacement Affected Communities (DACs) are able to influence decisions, policies and agreements that affect them collectively as well as where to live and how they are governed.

§ Percentage of target population in community groups with the ability to address or voice their concerns and engage in advocacy

§ Number of effective and accessible mechanisms in place to ensure access to land and/or secure tenure (housing, land and property rights)

§ Percentage of DACs who believe that the government is responsive to their rights and needs.

Outcome 2: DACs have improved access and use of basic services/material safety as other non-displacement affected communities

§ Percentage of target population who are able to achieve an adequate standard of living.

§ Percentage of DACs with access to basic health care.

§ Percentage of target population that reports feeling safe in their community.

Outcome 3: DACs have improved access to adequate livelihoods through generating income and assets, gainful employment, and managing financial risk as other non-displacement affected communities.

§ Percentage of the unemployed among displaced compared to the resident population, disaggregated by sex and age (+/-26)

§ Percentage increase/decrease in mean income per month for displaced population by job type, disaggregated by sex and age

§ Percentage of target population having obtained a loan when needed (+/-26).

§ Percentage of households that report increased household income.

Outcome 4: Learning on best practices and lessons on Durable Solutions disseminated by SDSC programming are utilized by actors and stakeholders working in Somaliland

§ Number of learning recommendations that are incorporated by state or federal government policies after learning has been disseminated.

§ Number of learning recommendations that are incorporated by regional durable solutions institution policies after learning has been disseminated.

§ Number of learning recommendations that are incorporated by district government actions after learning has been disseminated.

Output level indicators

§ Number of community representation structures in place.

§ Number of Community Action Plans (CAPs) drafted and approved for implementation.

§ Number of effective and accessible mechanisms to address Housing Land and Property (HLP) disputes relevant to displacement.

§ Number of DACs target population with access to essential health care and protection services in comparison to the host community.

§ Number of health facilities with essential minimum health care services.

§ Number of healthcare workers (midwife, nurses, community health and female health workers) trained.

§ Number of Gender-Based Violence (GBV) service providers receiving training who demonstrate improved knowledge and attitudes in GBV practice.

§ Number of community outreach activities that include information about the locations and benefits of timely care for sexual assault survivors and other forms of GBV.

§ Number of police and judicial personnel in returnees /refugee areas trained on prevention of GBV and human rights violation (compared to national standards) effectively.

§ Number of Technical, Vocational Education and Training (TVET) centres established or rehabilitated (government or community).

§ Number of TVET centres with updated curriculum and enhance understanding of the market and linkages to employers.

§ Number of teachers and management staff in the TVET institutions trained.

3. Evaluation Target Audiences

The endline project evaluation is intended to benefit multiple stakeholders that have been involved directly or indirectly in the project implementation process. In particular, the following are the key stakeholders that will be involved in the evaluation process;

  • Project beneficiaries including IDP and host communities in Hargeisa and Burao Districts
  • Targeted health facilities in Burao District
  • Gender-Based Violence and Child Protection Committees
  • Ministry of Planning, Ministry of Education, Ministry of Resettlement, Rehabilitation and Reconstruction, Somaliland National Displacement and Refugee Agency
  • Consortium members including; World Vision, DRC, NRC, CWW and TAAKULO
  • The Regional Durable Solutions Secretariat (ReDSS)
  • World Vision Germany
  • European Union (EU).

4. Evaluation type

This is an end of project evaluation that is aimed at assessing the progress made by the project towards achieving the project goal of creating a conducive environment for displacement (or mixed migration) affected communities in Somalia to reach a durable solution. The assessment of the project impact will focus on the contribution made by the project from inception.

5. Evaluation Purpose and Objectives

The primary purpose of this evaluation is to assess the impact, appropriateness, effectiveness, efficiency and sustainability of the SDSC project. The project endline evaluation will also help to draw key lessons learnt and the best practices to the project stakeholders. In particular, the project evaluation will be shaped by the following specific key evaluation questions:

Objective: Key Evaluation Questions

Impact: What has been the impact of the project interventions on the community? Besides, establish the level of impact on the target beneficiaries.What contributions have made the projects interventions on the impact measured on the community and the target beneficiaries? What other factors and actors contributed?

Relevance

  • Community involvement and participation in the design process, goal setting, planning and implementation.
  • How equitably has the project benefited the; women, men, boys and girls, returnees, IDPs and refugees?
  • The relevance and appropriateness of project design to the needs of the community.Effectiveness
  • What are the achievements against set objectives?
  • Compare actual with planned outputs and how have outputs been translated into outcomes.
  • The evaluation shall also establish the possible deviation from planned outputs and likely outcomes.Efficiency
  • How adequate were the available resources qualitatively and quantitatively?
  • Were all the project resources utilised optimally?
  • Explore alternative low-cost approaches that could have been used to achieve similar results?
  • How could the efficiency of the project be improved without compromising outputs?
  • Assess the timeliness of implementing the project activities.
  • How adequate were the reporting and monitoring systems of the project?
  • Have the project outputs been achieved at a reasonable cost?Sustainability
  • Are there sustainability plans, structures and skills in place to ensure there is sustainability of project benefits? How adequate are they?
  • How is the community and local partners prepared to continue with the project outcomes?
  • How likely are the outcomes to be sustainable and enduring? In what ways will it leave a legacy for its beneficiaries and the communities?
  • In what ways are women and men in communities, the local partners and government stakeholder’s partners prepared to continue with the project outcome?

6. Evaluation Methodology

The evaluation methodology will be designed in alignment with World Vision’s Learning through Evaluation with Accountability and Planning (LEAP) guidelines and principles. To ensure the quality of evidence, the evaluation will be designed with reference to the Bond Evidence Principles Checklist. Specific reference will be made to the 5 key dimensions of voice and inclusion, appropriateness, triangulation, contribution and transparency.

The data collection process will apply both quantitative and qualitative methods. A detailed evaluation methodology will be designed by the external Consultant in consultation with WV Design Monitoring and Evaluation Manager, Consortium Project Coordinator and WV Germany Monitoring and Evaluation Advisor who will validate the sampling strategy and procedures.

The detailed design of methodology must include the following;

  • The evaluation design
  • Sampling for qualitative and quantitative surveys
  • Data collection instruments, protocols and procedures
  • Procedures for analysing quantitative and qualitative data
  • Data presentation/dissemination methods.
  • Report writing and sharing etc.
  • The endline evaluation should take into account the methodology of the baseline and midterm evaluation to ensure data comparability.
  • The key data collection methods will include the following, among others.
  • Document reviews including the project proposal, baseline report, quarterly and semi-annual reports, midterm evaluation report, monitoring reports and project review reports.
  • Focus Group Discussions (FGD) involving primary project participants and
  • Key Informant Interviews with the line ministries, district authorities, consortium members and community leaders, among others.
  • Quantitative survey
  • Reflection and feedback sessions with staff and partners.

The Consultant will be expected to employ mobile data collection using smartphones leveraged on Kobo toolbox. Also, Geographical Information System (GIS) solutions will also be employed in the evaluation process; ranging from data collection, analysis and presentation of results.

7. Evaluation Deliverables

The Consultant will be expected to deliver the following outputs:

  1. An inception report detailing the approach and methodology to be used and sample size calculations, a detailed execution plan, data-collection tools.
  2. Draft report submitted to WV Somalia within an agreed timeline between the WV Somalia and the Consultant
  3. Presentation of the key findings and recommendations to SDSC Consortium in Hargeisa.
  4. All indicators must be presented overall and disaggregated by sex and disability status, where appropriate.
  5. Collected data (raw) after analysis complete with variable labels and codes, and the final evaluation tools submitted to WV Somalia and alongside the final report.
  6. Final report (soft copy) and 3 hard copies submitted to WV Somalia Quality Assurance team and SDSC Project Coordinator.
  7. The Consultant should note that the Final Evaluation Report shall follow the structure below customized from the UNDP (2009) Handbook on Planning, Monitoring, and Evaluation for Development Results.
  8. The evaluation report will also be guided by Bond Evidence Principles Checklist. Specific reference will be made to the 5 key dimensions of voice and inclusion, appropriateness, triangulation, contribution and transparency.
  9. The Consultant will be required to prepare and submit an executive brief of the evaluation report with infographics summarising the key project achievements, recommendations, lessons learnt and the best practices.

7.1 Evaluation Report Structure

Title and Opening pages (front matter)-should provide the following basic information:

i. Name of the project evaluated

ii. Time frame of the evaluation and date of the report

iii. Project location (districts and country)

iv. SDSC consortium logo as well as partner organisations

v. Acknowledgments

Table of Contents-including boxes, figures, tables, and annexes with page references.

List of acronyms and abbreviations

Executive Summary

A stand-alone section of two to three pages that should:

  • Briefly describe the intervention (the project(s) that was evaluated.
  • Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses
  • Describe key aspect of the evaluation approach and methods.
  • Summary of the key findings, conclusions, and recommendations.

Introduction

This section will;

  1. Provide a brief explanation of why the evaluation was conducted, why the intervention is being evaluated at this point in time, and why it addressed the questions it did.
  2. Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why and how they are expected to use the evaluation results.
  3. Identify the intervention (the project(s) that was evaluated
  4. Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the mid-term evaluation and satisfy the information needs of the report’s intended users.

Description of the Intervention

This section will provide the basis for report users to understand the logic and assess the merits of the mid-term evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. In particular, the section will;

  1. Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.
  2. Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.
  3. Link the intervention to the durable solution framework
  4. Identify any significant changes (plans, strategies, logical frame-works) that have occurred overtime and explain the implications of those changes for the evaluation
  5. Identify and describe the key partners involved in the implementation and their roles.
  6. Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.
  7. Indicate the total resources, including human resources and budgets.
  8. Describe the context of the social, political, economic, and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.
  9. Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).Evaluation Scope and Objectives This section of the report will provide an explanation of the evaluation’s scope, primary objectives and main questions.
  10. Evaluation scope-define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.
  11. Evaluation objectives-spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.
  12. Evaluation criteria-define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation.
  13. Evaluation questions- the evaluation questions will define the information that the mid-term evaluation will generate. The report will detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

Evaluation Approach and Methods

This section will describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description will help the report users judge the merits of the methods used in the endline evaluation and the credibility of the findings, conclusions and recommendations. The description of methodology will include discussion of each of the following:

  1. Data sources-sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions.
  2. Sample and sampling frame-the sample size and characteristics; the sample selection criteria, the process for selecting the sample (e.g. random, purposive); and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results (an illustrative sample-size sketch follows this list).
  3. Data collection procedures and instruments-methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source and evidence of their reliability and validity.
  4. Performance standards-standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).
  5. Stakeholder engagement-stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.
  6. Major limitations of the methodology-major limitations of the methodology shall be identified and openly discussed as to their implications for evaluation, as well as steps taken to mitigate those limitations.
  7. Data analysis-procedures used to analyse the data collected to answer the evaluation questions. This will detail the various steps and stages of analysis that will be carried out, including the steps to confirm the accuracy of data and the results. The report will discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.
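
To make the sampling discussion above concrete, the following is a minimal illustrative sketch (in Python) of one common way to derive a household survey sample size: Cochran’s formula with a finite population correction, a design effect and a non-response buffer. The parameter values shown are placeholder assumptions for illustration only; the actual population figures, precision, design effect and non-response allowance would be agreed with WV Somalia during the inception phase.

    import math

    def household_sample_size(population, p=0.5, margin_of_error=0.05,
                              z=1.96, design_effect=1.5, non_response=0.10):
        """Illustrative sample-size calculation using Cochran's formula with a
        finite population correction. All defaults are placeholder assumptions,
        not values prescribed by this ToR."""
        # Unadjusted sample size for an effectively infinite population
        n0 = design_effect * (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        # Finite population correction
        n = n0 / (1 + (n0 - 1) / population)
        # Inflate for expected non-response
        return math.ceil(n / (1 - non_response))

    # Placeholder target population used purely for illustration
    print(household_sample_size(population=30_000))

Under these placeholder assumptions the sketch returns roughly 630 households; the Consultant’s technical proposal should justify the actual sample size and its allocation across the project locations.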

Findings and Conclusions

This section will present the evaluation findings based on the analysis and conclusions drawn from the findings. In particular,

Findings: This section will present findings as statements of fact that are based on analysis of the data. The evaluation findings will be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results will be explained, as well as factors affecting the achievement of intended results. The assumptions or risks in the project design that subsequently affected implementation will also be discussed.

Conclusions: This section will be comprehensive and balanced and highlight the strengths, weaknesses and outcomes of the intervention. The conclusion section will be substantiated by the evidence and logically connected to the evaluation findings. The conclusion will also respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making.

Recommendations: The endline evaluation will seek to provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations will be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. This shall also address the sustainability of the initiative and comment on the adequacy of the project exit strategy.

Lessons Learned

The report will include discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (the intervention, context, outcomes, and even the evaluation methods) that is applicable to similar contexts. Concise lessons based on specific evidence will be presented in the endline evaluation report.

Report Annexes

The Annex section will include the following to provide the report reader with supplemental background and methodological details that enhance the credibility of the report.

  • ToR for the evaluation
  • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate
  • List of individuals or groups interviewed or consulted and sites visited
  • List of supporting documents reviewed
  • Project results map or results framework
  • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators.

8. Time frame

The overall evaluation process is expected to take 45 days, including preparation, data collection, analysis and reporting. The Consultant should be able to undertake some of the tasks concurrently to fit within the planned time frame, without compromising the quality expected. The assignment is expected to commence on 15th January 2020, with the final evaluation report expected by 28th February 2020.

9. Authority and Responsibility

WV Somalia will establish an evaluation team to oversee all the related tasks. The DME Manager will be responsible for the overall coordination of all the evaluation tasks with the Consultant. In addition, the Consortium DME Coordinator, Consortium Project Coordinator, Regional Operations Manager, Quality Assurance & Strategy Manager and WV Germany Monitoring and Evaluation Advisor will provide all the necessary technical and operational support required throughout the evaluation process.

Support from WV Somalia

WV Somalia will be responsible for the following:

  • Share all necessary documents with the Consultant to finalize the evaluation methodology and data collection tools
  • Provide input for evaluation study methodology, data collection tools and report.
  • Ensure that input from SDSC Consortium is circulated and shared with external Consultant
  • Flight expenses for the Consultant to Somaliland (where necessary)
  • Vehicle hire to support the evaluation exercise
  • Food and accommodation for the Consultant in Somaliland
  • Working space for the Consultant while in Somaliland
  • Recruitment and payment of enumerators
  • Stationery for data collection
  • Overall accountability of the evaluation process
  • Guidance and coordination throughout all the phases of evaluation, keeping communication with external Consultant throughout all phases
  • Provide support to the evaluation technical lead (external Consultant) for the evaluation field visits processes such as orientation and training of enumerators, FGDs and KIIs
  • Closely follow up the data collection process, ensuring quality control, daily debriefing and meeting the timelines set for interview completion
  • Inform the evaluation audience of their involvement in the study and help in setting specific dates for the evaluation field schedule
  • Provide smartphones/tablets and a Kobo Toolbox server for data collection where required (an illustrative data-retrieval sketch follows this list)
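
Where data are collected on KoboToolbox, the Consultant may need to retrieve submissions for the daily quality checks and debriefs referred to above. The following is a minimal illustrative sketch (in Python) assuming the standard KoboToolbox REST API (v2); the server URL, asset UID and API token are placeholders that WV Somalia would supply, and pagination is ignored for brevity.

    import requests

    # Placeholder values; the actual KoboToolbox server, form (asset) UID and
    # API token would be provided by WV Somalia.
    KOBO_SERVER = "https://kf.kobotoolbox.org"
    ASSET_UID = "a1234567890abcdefghijk"
    API_TOKEN = "replace-with-api-token"

    def fetch_submissions():
        """Fetch submissions for one form via the KoboToolbox API v2."""
        url = f"{KOBO_SERVER}/api/v2/assets/{ASSET_UID}/data.json"
        headers = {"Authorization": f"Token {API_TOKEN}"}
        response = requests.get(url, headers=headers, timeout=60)
        response.raise_for_status()
        return response.json()["results"]

    if __name__ == "__main__":
        submissions = fetch_submissions()
        print(f"{len(submissions)} submissions downloaded")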

The Consultant will be responsible for the following:

  • Review all relevant documents for evaluation study
  • Develop the evaluation study design, including the survey methodology and the data collection tools (reviewing the existing household questionnaire, focus group guides, interview protocol and data entry templates). In addition, prepare a field manual for training, in consultation with the evaluation team, reflecting WV Somalia feedback on the methodology. These should be closely based on the tools used at baseline and midterm to allow appropriate comparisons over the life of the project
  • Design the XML forms, data entry templates, procedures and systems, and train data entry clerks in the use of the templates
  • Develop the field work schedule in consultation with evaluation team
  • Conduct training for enumerators during field visits phase, finalize the evaluation schedule
  • Supervise the data collection process, provide advice and ensure the quality of the data
  • Conduct interviews (KII) with the Consortium members and line ministries
  • Conduct data analysis and report writing. At least two drafts are expected to be provided to WV Somalia, with feedback addressed in each round before submission of the final report
  • Provide the required data, complete and labelled in English (variables and values), in both SPSS and Microsoft Excel file formats (an illustrative export sketch follows this list)
  • Provide final versions of data collection tools.
  • Provide daily field briefings to the DME Manager, SDSC DME Coordinator and SDSC Project Coordinator on progress and any challenges from the field.
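
As a purely illustrative sketch of the labelled-data deliverable above, the Python example below writes one placeholder dataset to both SPSS (.sav) and Microsoft Excel formats with English variable and value labels, using the pandas and pyreadstat libraries (the Excel output additionally requires openpyxl). The variable names, codes, labels and file names are assumptions for illustration, not prescribed by this ToR.

    import pandas as pd
    import pyreadstat

    # Placeholder household-survey extract; real variables would follow the
    # agreed endline questionnaire.
    df = pd.DataFrame({
        "hh_id": [1, 2, 3],
        "district": [1, 1, 2],       # coded response
        "income_source": [2, 1, 3],  # coded response
    })

    # English labels for variables and coded values (illustrative only)
    column_labels = ["Household identifier",
                     "District of residence",
                     "Main source of household income"]
    value_labels = {
        "district": {1: "District A", 2: "District B"},
        "income_source": {1: "Casual labour", 2: "Small business", 3: "Remittances"},
    }

    # SPSS file with English variable and value labels
    pyreadstat.write_sav(df, "endline_household_survey.sav",
                         column_labels=column_labels,
                         variable_value_labels=value_labels)

    # Excel copy of the same dataset
    df.to_excel("endline_household_survey.xlsx", index=False)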

10. Limitations

Time and security may be major limitations with regard to assessment processes in fragile and volatile contexts such as Somalia, and this often makes it challenging to keep strictly to a set agenda. In addition, in Somalia households spend a better part of the afternoon hours in prayers, which will make it hard for enumerators to administer many questionnaires per day (in an effort to complete the assessment on time). To address this, WV Somalia will firstly allocate extra overflow days for field data collection. The WV Somalia team will also work closely with the security department to ensure that the evaluation field processes are conducted at the most appropriate times and under secure conditions. The Consultant should therefore be able to demonstrate some flexibility when required.

11. Documents

The key documents to be reviewed for the evaluation study are as follows:

  • Project document (needs assessment, proposal, log frame)
  • The Regional Durable Solutions Secretariat (ReDSS) framework
  • Baseline Report
  • Midterm evaluation report
  • Quarterly, semi-annual and annual reports, and monitoring reports
  • Training reports
  • Success stories
  • Any district level secondary data and other relevant documents and reports.

12. Qualifications of the Consultant

We are looking for a Consultant with the following skills and qualifications;

  • The team leader MUST possess at least a Master’s Degree in any of the following fields; International Development, Social Sciences, Statistics, Community Development, Development Studies, Local Government or any related field.
  • Strong and documented experience in conducting participatory qualitative assessments related to food security, livelihoods, and water and sanitation programming.
  • Demonstrated experience in leading at least three similar project evaluation studies involving methods such as surveys and group interviews.
  • At least 10 years’ experience in conducting baseline studies and evaluations for complex projects, such as livelihoods, education and protection, infrastructure development, health, and water, sanitation and hygiene projects implemented by non-governmental and private sector actors.
  • A solid understanding of remote learning and use of mobile technology in data collection,
  • Demonstrated experience in leading teams and training local staff in quantitative and qualitative data collection tools, including data entry templates.
  • Demonstrated experience in designing survey methodology, data collection tools, processing and analysis of data.
  • Ability to interact with host government, partners as requested by WV Somalia;
  • Strong organizational, analytical and reporting skills, presentation skills, attention to detail, ability to meet deadlines, and proficiency in SPSS or other statistical packages, Microsoft Office and qualitative data analysis software/tools.
  • Previous experience in a fragile country with a tight security context is preferred.
  • Capacity to use mobile data collection and GIS tools for data collection, and analysis of survey results.
  • Excellent verbal and written communication in English required.

How to apply:

13. Application Process and Requirements

Qualified and interested parties are asked to submit the following;

  1. Letter of interest in submission of a proposal
  2. A detailed technical proposal clearly demonstrating a thorough understanding of this ToR and including but not limited to the following;
     • Consultant/Company profile
     • Description of the evaluation methodology as outlined in this ToR
     • Demonstrated previous experience in similar assignments and qualifications outlined in this ToR (with submission of the two most recent reports)
     • Proposed data management plan (collection, processing and analysis)
     • Proposed timeframe detailing activities and a work plan
     • Team composition and level of effort of each proposed team member (include CVs of each team member)
  3. A financial proposal with a detailed breakdown of costs for the study, quoted in United States dollars.

All applications should be sent electronically to [email protected] with attachments in pdf and a subject line: "Technical and Financial Proposal for End of Project Evaluation-SDSC-Somaliland"

The top three shortlisted Consultants will be required to make an oral presentation of the technical proposal to the Supply Chain and Core Project Evaluation Technical team to inform the final decision on the award of the contract.

The submission of technical and financial proposals closes on: 30th December 2019.
