Thanks for your interest in ScotiaTech, Scotiabank's new and innovative technology hub in Bogota. Join a purpose-driven winning team that promotes creativity and innovation in a fast-paced environment, where we're always committed to results, in an inclusive, diverse, and high-performing culture.

Purpose

The DataOps Engineer is responsible for building, maintaining, and optimizing scalable and reliable data pipelines and infrastructure in a cloud environment, under the guidance of the Senior DataOps Engineer. This role collaborates with a global team of DataOps Engineers and cross-functional stakeholders to ensure high-quality data products and services that support the organization's data-driven goals. By implementing DataOps best practices and leveraging cloud technologies, the DataOps Engineer contributes to efficient data workflows, operational reliability, and seamless delivery across multiple countries.

Accountabilities

• Data Pipeline Development and Maintenance:
- Assist in the design, development, and maintenance of cloud-based data pipelines on platforms such as Azure, GCP, or AWS (e.g., Dataflow, BigQuery, GCS, S3, Lambda functions, Cloud Functions, Composer), following established best practices and under the guidance of senior engineers.
- Implement and support automated data workflows, including ETL processes, data quality checks, and orchestration, using tools like Python, Spark, Airflow, or Cloud Composer.
- Support efforts to meet performance and reliability targets as defined by senior team members.
- Support the deployment and management of cloud data infrastructure components, ensuring scalability, security, and cost efficiency as directed by the Senior DataOps Engineer or other senior staff.
- Apply infrastructure-as-code (IaC) practices using tools like Terraform or CloudFormation to provision and manage resources.
- Monitor and assist in optimizing cloud resource usage (e.g., storage, compute) using tools like Azure Cost Management or GCP Billing.

• Automation and CI/CD Implementation:
- Build and maintain CI/CD pipelines for data workflows using tools such as Jenkins, GitHub Actions, Cloud Build, and JFrog to enable rapid and reliable deployments.
- Automate repetitive tasks, such as data validation and monitoring, to improve operational efficiency.

• Collaboration and Global Teamwork:
- Work closely with Data Engineers, Data Scientists, and business teams to deliver data products that meet business requirements.
- Collaborate with global DataOps team members across multiple time zones, contributing to shared goals and knowledge sharing.
- Provide regular updates on task progress and escalate issues to the Senior DataOps Engineer as needed.

• Monitoring and Incident Response:
- Monitor data pipelines and infrastructure using tools like Cloud Monitoring, Prometheus, or Cloud Logging to ensure operational health.
- Participate in incident response and troubleshooting, conducting root cause analysis to resolve issues within defined SLAs (e.g.,