Job Description

We are seeking an experienced Data Engineer to design, develop, and maintain robust ELT pipelines that ingest data from multiple systems into Snowflake. The ideal candidate can model raw data into clean, reliable, analytics-ready datasets for internal teams and stakeholders, and brings a solid foundation in SQL, data modeling, and ELT orchestration tools, along with experience working with cloud infrastructure and modern data stack technologies.

Required Skills and Qualifications

- Conversational to professional English
- 3+ years of experience in data engineering or backend engineering with a focus on data workflows
- Strong SQL skills, particularly in cloud data warehouses such as Snowflake or BigQuery
- Hands-on experience designing and maintaining ELT pipelines (dbt, Airflow, custom scripts)
- Familiarity with modern data modeling best practices (e.g., dimensional modeling, star schemas)
- Proficiency in a scripting language such as Python for data transformations and automation
- Experience working with cloud environments (Azure preferred; AWS or GCP acceptable)

Benefits

- Opportunity to work on complex data projects
- Collaborative team environment
- Professional development opportunities

Nice to Have

- Exposure to CI/CD tools for data infrastructure (e.g., GitHub Actions, Azure DevOps)
- Experience with monitoring/logging tools for data pipelines