**Senior Data Engineer**

**Responsibilities**

- Design, build, and maintain scalable, efficient data architectures, including databases, data warehouses, data lakes, data marts, and other storage solutions.
- Design, build, and maintain robust data pipelines for the extraction, transformation, and loading (ETL) of data from various sources to storage and analysis destinations.
- Develop and implement data models that support the organization’s analytical and reporting needs, ensuring data integrity and optimal performance.
- Integrate data from different sources, such as databases, APIs, and external systems, to create a unified and consistent view for analysis.
- Implement data quality management through processes and tools that ensure the quality and reliability of data being collected and stored.
- Optimize data infrastructure and processes for performance and scalability, ensuring timely and efficient data retrieval and processing as data volumes grow.
- Implement security measures to protect sensitive data and ensure compliance with data privacy regulations.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements, collect and prepare data for analysis, and provide the necessary infrastructure.
- Document data engineering processes, data flows, and system architecture to ensure transparency and ease of maintenance.
- Define, build, and maintain KPIs, analytics, reporting, and dashboards.

**Technical Requirements**

- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a data engineer building analytics systems and solutions.
- Proficiency in SQL and database systems (e.g., PostgreSQL, MySQL, SQL Server).
- Experience with the Snowflake cloud data warehouse.
- Experience with Kafka (pub/sub architecture patterns).
- Experience with ETL tools and data integration processes.
- Familiarity with big data technologies (e.g., Apache Spark, Hadoop).
- Strong programming skills (e.g., Python, Java) for scripting and automation.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Prior experience working with large datasets.
- Excellent problem-solving and communication skills.

**Bonus Skills**

- Experience with Tableau, Salesforce, Google Analytics, Mixpanel, or other modern analytics platforms.
- Experience with JSON data formats.
- Kotlin/Micronaut backend experience.