Data Engineer


Background/IRC Summary:

Technology and Operations support the organization’s work by providing reliable and scalable solutions for the IRC’s offices around the world. The Data Team at IRC is responsible for the design and delivery of global data strategies and the systems and products that deliver on them.

Job Overview/Summary:

The Data Engineer will support the implementation, configuration, and maintenance of data systems and pipelines across IRC’s data environment. This role assists in building and operating ETL/ELT processes, data integrations, and cloud-based data platforms such as Azure Databricks, Synapse, and Fabric.

The successful candidate will help maintain Lakehouse data environments by monitoring pipeline execution, supporting data loads, and assisting in data modeling tasks under guidance from senior team members. This is a hands-on technical role that requires foundational data engineering knowledge, willingness to learn, and strong collaboration skills.

The Data Engineer will work closely with senior engineers and architects but will not be responsible for deputizing for the Data Architect or owning critical security responsibilities.

Major Responsibilities:

1. Design, build, and maintain reliable ETL/ELT data pipelines for batch and near-real-time processing from internal and external sources using tools such as Azure Data Factory or Databricks workflows.
2. Implement data validation, testing, and reconciliation checks (including dbt tests where applicable).
3. Monitor pipeline health, performance, and reliability.
4. Identify issues and escalate or collaborate with senior engineers to resolve them.
5. Write SQL and Python queries for data extraction and transformation.
6. Support documentation of processes, standards, and improvements.
7. Support solution design by preparing data samples, documentation, or prototype queries.
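The validation and reconciliation checks described in the responsibilities above can be sketched as a short Python routine. This is a minimal illustration only: the table names and the in-memory SQLite database stand in for real source systems and Lakehouse tables.

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live in
# separate systems (e.g. a source extract and a Lakehouse table).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL);
    CREATE TABLE target_orders (id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(conn):
    """Compare row counts and summed amounts between source and target."""
    src_count, src_sum = conn.execute(
        "SELECT COUNT(*), SUM(amount) FROM source_orders").fetchone()
    tgt_count, tgt_sum = conn.execute(
        "SELECT COUNT(*), SUM(amount) FROM target_orders").fetchone()
    return {
        "row_count_match": src_count == tgt_count,
        "amount_sum_match": abs(src_sum - tgt_sum) < 1e-9,
    }

print(reconcile(conn))
```

In a production pipeline the same comparisons would typically be expressed as dbt tests or pipeline-level assertions, with failures escalated rather than printed.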

Key Working Relationships:

  • Data Team
  • Business/Departmental Priority Setters
  • Enterprise Systems Owners

Position Reports to: Omar Bouidel

Travel Requirements:


Minimum Requirements:

  • Experience: 2–4 years of hands-on experience in data engineering, data processing, or software engineering.
  • Technical Skills:
    • SQL (advanced): joins, window functions, CTEs, performance tuning
    • Python: data processing, APIs, automation, PySpark basics
    • Data modeling: star/snowflake schemas, fact & dimension tables
    • ETL/ELT pipelines: building, monitoring, and optimizing pipelines
    • dbt Core / dbt Cloud: developing, scheduling, and maintaining models
  • Version Control/CI-CD: Familiarity with Git or other version control systems and basic CI/CD workflows.
  • Problem-Solving: Strong problem-solving skills and attention to detail.
  • Communication: Good communication and teamwork skills.
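The advanced SQL skills listed above (CTEs, window functions) can be illustrated with a small self-contained example. SQLite is used here purely so the query runs anywhere, and the sales table is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('EA', '2024-01', 100), ('EA', '2024-02', 150),
        ('WA', '2024-01', 80),  ('WA', '2024-02', 120);
""")

# A CTE plus a window function: running revenue total per region.
rows = conn.execute("""
    WITH monthly AS (
        SELECT region, month, revenue FROM sales
    )
    SELECT region, month,
           SUM(revenue) OVER (
               PARTITION BY region ORDER BY month
           ) AS running_revenue
    FROM monthly
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The same PARTITION BY / ORDER BY pattern underlies most warehouse SQL (ranking, deduplication, running totals), which is why postings like this call it out separately from basic joins.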

Preferred Additional Requirements:

  • Experience with cloud platforms (Azure preferred), Azure Data Factory, or similar cloud data tools.

Working Environment:

  • Remote

More Information

  • Job City: Kenya


The International Rescue Committee (IRC) responds to the world’s worst humanitarian crises and helps people to survive and rebuild their lives. Founded in 1933 at the request of Albert Einstein, the IRC offers lifesaving care and life-changing assistance to refugees forced to flee from war or disaster. At work today in over 40 countries and 22 U.S. cities, we restore safety, dignity and hope to millions who are uprooted and struggling to endure. The IRC leads the way from harm to home.

Since October 2012, the IRC has been responding to humanitarian needs of Nigerians. The IRC initially intervened in response to floods that affected over 7 million people across the country, destroying harvest and damaging homes. The IRC is currently implementing programs in Health, Protection, WASH, Nutrition, Food Security, and Women’s Protection and Empowerment (WPE) in Adamawa and Borno States in North-Eastern Nigeria.

The IRC is dedicated to making women and adolescent girls healthier from the earliest phase of acute crises (a target group most vulnerable during crisis) and implements evidence-based reproductive health interventions in line with the SPHERE-standard Minimum Initial Service Package for Reproductive Health in Crises (MISP). The goal is to ensure that the IRC’s health responses in emergencies include the core package of Reproductive Health (RH) services in its interventions.

The IRC’s Reproductive Health (RH) program is currently implementing the MISP for RH in 4 health care centers in MMC and Jere LGAs and 1 IDP camp clinic. In addition, the program is starting up emergency mobile programming outside of these areas of Maiduguri in coordination with the WPE team. The focus of this program is to provide quality comprehensive RH and WPE services to conflict-affected women and girls in a timely manner. In addition to the mobile program, the WPE and RH joint mobile teams will be in charge of rapid assessments and rapid response. The mobile teams will be focused on the newly opened LGAs and emergency areas previously inaccessible due to conflict and insecurity. These teams will provide life-saving services to populations outside of Maiduguri who have not had access to services in approximately 3 years.

Contract Type: Full Time, 40 hours per week
