Data Engineer - Snowflake & Informatica IICS

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. 

Role Overview

We are seeking a highly skilled Data Engineer to design, build, and maintain scalable data pipelines and ETL processes that enable seamless data integration and analytics. The ideal candidate will have strong expertise in SQL, Python, and modern data platforms, with a focus on Snowflake and Informatica IICS. This role requires close collaboration with analysts, data scientists, and business teams to ensure data quality, governance, and accessibility.

Your Role

  • Design and Develop Data Pipelines: Build robust, scalable ETL processes to ingest and transform data from multiple sources.
  • Data Integration: Integrate structured and unstructured data into Snowflake using Informatica IICS and other tools.
  • Reusable Components: Develop reusable Python and SQL scripts for data transformation and automation.
  • Collaboration: Work closely with analysts, data scientists, and business stakeholders to understand requirements and deliver solutions.
  • Data Quality & Governance: Implement data validation, lineage tracking, and compliance with governance standards.
  • Performance Optimization: Tune SQL queries and pipeline performance for efficiency and scalability.
  • Automation & Orchestration: Manage workflows using orchestration tools like Airflow and support CI/CD deployments.
  • Documentation: Maintain clear technical documentation for data processes and pipelines.

Your Profile

  • Bachelor's degree in Computer Science or a related field.
  • Strong proficiency in SQL and Python for data engineering tasks.
  • Hands-on experience with Snowflake and Informatica IICS.
  • Familiarity with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines.
  • Knowledge of cloud data platforms (AWS, Azure, or GCP) and API integrations.
  • Excellent problem-solving, analytical, and communication skills.

Typical Daily Activities

  • Monitor and troubleshoot ETL jobs and data pipelines.
  • Develop and optimize Python scripts and SQL queries.
  • Validate data quality and update lineage documentation.
  • Collaborate with cross-functional teams in daily stand-ups.
  • Schedule workflows and manage deployments via CI/CD.
  • Perform performance tuning and security checks in Snowflake.
  • Update documentation and share pipeline health reports.

Capgemini is an AI-powered global business and technology transformation partner, delivering tangible business value. We imagine the future of organizations and make it real with AI, technology and people. With our strong heritage of nearly 60 years, we are a responsible and diverse group of 420,000 team members in more than 50 countries. We deliver end-to-end services and solutions with our deep industry expertise and strong partner ecosystem, leveraging our capabilities across strategy, technology, design, engineering and business operations. The Group reported 2024 global revenues of €22.1 billion.
Make it real | www.capgemini.com

Ref. code:  379794
Posted on:  Dec 15, 2025
Experience Level:  Experienced Professionals
Contract Type:  Permanent
Location:  Cairo, EG

Brand:  Capgemini
Professional Community:  Data & AI