**What you will do**

- Collect, transform, and publish data to be used for insights
- Design, build, and operationalize data processing systems and pipelines
- Ensure data quality and efficiency
- Design and maintain database systems
- Integrate distributed systems into a single source of truth
- Analyze raw data
- Transform different forms of data into a usable format
- Operationalize machine learning models
- Conduct systems monitoring across cloud infrastructures
- Automate processes for installation, configuration, and monitoring
- Identify, create, and prepare the data required for modern BI solutions
- Create and document the tests needed to meet requirements

**Must haves**

- Bachelor’s or Master’s degree in computer science, engineering, mathematics, or a similar analytical field
- 3+ years of experience in data engineering and cloud engineering (AWS, Azure)
- Strong programming skills in at least one language such as Java, Python, or Scala
- Good knowledge of relational databases or NoSQL databases such as MongoDB and DynamoDB
- Technical expertise with data models, data mining, and segmentation techniques
- Good understanding of data lakes and data warehousing
- Good understanding of ETL tools such as AWS Glue and AWS Data Pipeline
- Interest or experience in Big Data technologies (Hadoop, Spark, Databricks, Snowflake)
- Open mindset and the ability to quickly adopt new technologies and learn new practices

**Nice to haves**

- AWS Certified Solutions Architect (preferred, not required)
- Interest or experience with machine learning principles
- Experience with real-time analytics and data streaming, and with analytics tools such as Elastic MapReduce, Elasticsearch, and Kinesis