**Company Description**

At Publicis Groupe, we are looking for a Senior Data Architect fluent in English to join Publicis Global Delivery, the outstanding platform we created to become a global interconnected network and provide offshore and nearshore solutions for our partners' and sister companies' businesses worldwide. We are a creation machine that never sleeps, continuously growing and evolving into a more efficient and collaborative system: a cross-media transformation agent, based in Argentina, Colombia, Costa Rica and Mexico, that provides centralized expertise across all Publicis Global Services' capabilities to enable consistent and standardized delivery across Media, Production, Commerce, Content, Data & Technology.

**Job Description**

**What you'll do**

We are offering the amazing opportunity to be part of a globally distributed team with a knack for understanding data and telling the story of why it matters. In our global projects you take on the expert role for any questions about data pipelines, evaluating interface requirements and translating them into a technical concept. You will work closely with our specialist teams and be largely responsible for the further development of our data landscape.

You are mainly responsible for:

- Leading the design of our cloud data platform.
- Advanced design and architecture of our data lake.
- Optimizing and further developing existing and new data pipeline processes.
- Analyzing all data interfaces and integrating them into the overall architecture of the data platform.
- Handing over operational and (partial) development topics to the offshore team.
- Establishing and guaranteeing standards, quality, stability and robustness along the entire process chain.

**Qualifications**

**We're looking for strong, impactful work experience, which typically includes**:

- Expert know-how in Python, (Azure) Databricks and Delta Lake.
- A very good understanding of data, with in-depth knowledge of data modeling and of processing data efficiently and at high performance.
- Familiarity with the varied modeling approaches of data lakes, data lakehouses and data warehouses, and the ability to map requirements to the right one.
- Experience handling CSV, JSON, Avro and Parquet file formats.
- Ideally, experience in using and configuring Spark and in working with Spark clusters.
- Knowledge of common DWH solutions (Synapse, Snowflake, Exasol) and experience in the Azure environment (Logic Apps, Blob Storage, Data Factory, ...) are desirable.
- A high sense of responsibility, independence and initiative, with a hands-on mentality.
- A strong technical affinity and an analytical mindset.
- Openness to innovative topics, curiosity and a willingness to learn.
- Fluent communication in English.

**Additional Information**

**Benefits**:

- Access to a high-quality prepaid medical plan.
- 100% remote work.
- Flexible schedule.
- Technical trainings and soft-skills development.
- Certification programs.
- Access to e-learning platforms.
- English lessons.
- Level-up program.
- Engagement activities and events.
- A mentor who'll coach you through developing your professional career!

All your information will be kept confidential.