
Databricks Data Engineer

Who You'll Be Working With

 

As a Databricks Data Engineer at Capgemini, you will design, build and operate reliable, scalable data pipelines and lakehouse solutions that enable analytics, AI and GenAI. You will work hands-on with Databricks, Apache Spark and the Azure data ecosystem to ingest, transform and serve trusted data products, taking ownership from development through to production support and continuous improvement.


You will be part of the Data Platforms team within the Insights and Data Global Practice, which has seen strong growth and continued success across a variety of projects and sectors. Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journeys using modern cloud platforms. We specialise in the latest frameworks, reference architectures and technologies across AWS, Azure and GCP, along with data platforms such as Databricks and Snowflake.

 

Please note: Security clearance: To be successfully appointed to this role, you must be eligible to obtain Security Check (SC) clearance. 
To obtain SC clearance, the successful applicant must have resided continuously within the United Kingdom for the last 5 years, along with meeting other criteria and requirements.

Throughout the recruitment process, you will be asked questions about your security clearance eligibility such as, but not limited to, country of residence and nationality.
Some posts are restricted to sole UK Nationals for security reasons; therefore, you may be asked about your citizenship in the application process.

The Focus Of Your Role

 

As a Databricks Data Engineer with an Azure and Databricks focus, you will be an integral part of our team dedicated to building scalable and secure data platforms. You will leverage Databricks, Apache Spark and Azure services to develop and optimise batch and streaming pipelines, implement Delta Lake lakehouse patterns, and deliver well-governed datasets that power reporting, analytics and AI/ML use cases.

 

  • Build and maintain data pipelines and lakehouse solutions: Use Databricks and Apache Spark to ingest, transform and curate data in Azure Data Lake Storage (ADLS) and Delta Lake.
  • Implement data modelling, quality and governance: Develop scalable data models, apply validation/quality checks, and follow governance practices to ensure reliable and auditable data products.
  • Enable AI/ML and analytics use cases: Prepare curated datasets and features, collaborate with data scientists, and integrate pipelines with ML workflows (e.g., MLflow) where required.
  • Monitor and optimise jobs and clusters: Tune Spark performance, improve reliability, manage costs, and implement observability (logging, alerting, SLAs) for production workloads.
  • Collaborate across teams: Work with business analysts, platform engineers, data scientists and DevOps to deliver secure, well-tested data solutions in an agile environment.
  • Apply engineering best practices: Use version control, code review, automated testing and CI/CD; keep current with Databricks capabilities and data engineering patterns.
  • Be a Databricks advocate: Share knowledge, contribute to accelerators and standards, and pursue Databricks certification/champion pathways.

 

What You'll Bring

 

You will bring strong, hands-on experience delivering modern data engineering solutions within complex environments, with an understanding of regulatory obligations and the need for trusted, mission-critical data. You will be able to build secure, scalable and well-governed pipelines and lakehouse data products that support analytics, AI and GenAI while meeting requirements for data protection, sovereignty, transparency and auditability.


You will be comfortable collaborating with stakeholders to translate outcomes into robust data solutions, and you will bring strong engineering discipline, documentation and teamwork skills to help teams deliver sustainable data capabilities.


Experience:

 

  • 5+ years' experience as a Data Engineer, including hands-on delivery of Databricks solutions in production environments.
  • Strong expertise in Databricks, Apache Spark and Delta Lake, with good understanding of lakehouse and data warehousing concepts.
  • Experience with Microsoft Azure, including ADLS Gen2, Azure Databricks, and orchestration tooling such as Azure Data Factory (or similar).
  • Proficiency in Python and SQL, with strong software engineering practices (Git, code review, unit testing, CI/CD) and an ability to troubleshoot production issues.
  • A continuous learning mindset, ideally with progress toward Databricks certification (e.g., Data Engineer Associate/Professional) or equivalent experience.
  • Relevant certifications (desirable): Databricks Data Engineer and/or Microsoft Azure data certifications.

Additional Info

 

Hybrid working: The places that you work from day to day will vary according to your role, your needs, and those of the business; it will be a blend of Company offices, client sites and your home, though you will be unable to work from home 100% of the time.

If you are successfully offered this position, you will go through a series of pre-employment checks, including identity, nationality (single or dual) or immigration status, employment history going back 3 continuous years, and an unspent criminal record check (via the Disclosure and Barring Service).

 

What we’ll offer you 

 

You will be encouraged to have a positive work-life balance.  Our hybrid-first way of working means we embed hybrid working in all that we do and make flexible working arrangements the day-to-day reality for our people.  All UK employees are eligible to request flexible working arrangements. 

You will be empowered to explore, innovate, and progress. You will benefit from Capgemini's 'learning for life' mindset, meaning you will have countless training and development opportunities, from thinktanks to hackathons, and access to 250,000 courses with numerous external certifications from AWS and Microsoft, Harvard ManageMentor, cybersecurity qualifications and much more.


Why we’re different 

At Capgemini, we help organisations across the world become more agile, more competitive, and more successful. Smart, tailored, often ground-breaking technical solutions to complex problems are the norm. But so, too, is a culture that’s as collaborative as it is forward thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too.
 
Capgemini is proud to represent nearly 130 nationalities and the cultural diversity they bring. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age, and religion. Capgemini views diversity as everything that makes us who we are as an organisation, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought.

Ref. code:  467225
Posted on:  28 Apr 2026
Experience Level:  Experienced Professionals
Contract Type:  Permanent
Location: 

London, GB; Birmingham, GB; Manchester, GB; Bristol, GB; Newcastle upon Tyne, GB

Brand:  Capgemini
Professional Community:  Architecture
