Empower Your Career at Capgemini

Join a collaborative community of colleagues around the world where you'll be inspired and supported to shape your career in the way you'd like.

Responsibilities

We're looking for an expert in data engineering who can:
- Identify and troubleshoot the root causes of failures in pipelines built with Azure Data Factory.
- Review team monitoring reports to identify issues and create corresponding User Stories / PBIs for resolution.
- Monitor the status of Databricks jobs and ADF pipelines.
- Apply the concepts of ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.

You'll work closely with Azure Cloud Services, Azure Data Factory, and Python / Spark (Azure Databricks); a brief illustrative monitoring sketch appears at the end of this posting.

Qualifications

To succeed in this role, you'll need:
- A Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- 3+ years of experience in data engineering or a related role.
- Proficiency in Python.
- A strong understanding of ETL/ELT processes.
- Experience with Azure Cloud Services, especially Azure Data Factory.
- The ability to review monitoring reports and create PBIs for necessary fixes.
- A focus on system maintenance.

Preferred Qualifications

These skills are a bonus:
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with Databricks, Power BI, or Azure DevOps.
- Experience working in Agile/Scrum environments.

At Capgemini, you'll become part of something bigger than yourself. You'll be empowered to grow and learn in a collaborative community of colleagues around the world.
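
To give a flavor of the day-to-day pipeline-monitoring work described above, here is a minimal Python sketch that lists recent Azure Data Factory pipeline runs and flags failures using the azure-identity and azure-mgmt-datafactory packages. It is illustrative only, not part of the role description; the subscription, resource group, and factory names are placeholders.

    # Illustrative sketch: flag failed ADF pipeline runs from the last 24 hours.
    # Requires the azure-identity and azure-mgmt-datafactory packages.
    # The subscription, resource group, and factory names below are placeholders.
    from datetime import datetime, timedelta, timezone

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import RunFilterParameters

    SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
    RESOURCE_GROUP = "<resource-group>"     # placeholder
    FACTORY_NAME = "<data-factory-name>"    # placeholder


    def report_failed_runs(hours: int = 24) -> None:
        """Print any pipeline runs that failed within the last `hours` hours."""
        client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
        now = datetime.now(timezone.utc)
        runs = client.pipeline_runs.query_by_factory(
            RESOURCE_GROUP,
            FACTORY_NAME,
            RunFilterParameters(
                last_updated_after=now - timedelta(hours=hours),
                last_updated_before=now,
            ),
        )
        for run in runs.value:
            if run.status == "Failed":
                # run.message carries the error surfaced by ADF, a starting
                # point for root-cause analysis before raising a PBI.
                print(f"{run.pipeline_name} ({run.run_id}): {run.message}")


    if __name__ == "__main__":
        report_failed_runs()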