Join our remote team as a Senior Data Software Engineer at a leading provider of AI-based solutions for industry. We are seeking a hands-on developer with deep expertise in building data pipelines, deploying them to production, and working within a cloud platform. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment.

Responsibilities
- Design and implement data pipelines using the Agile development process (Scrum)
- Ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of data pipelines
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions

Requirements
- At least 3 years of production experience in Data Software Engineering
- Hands-on expertise in cloud-based platforms, with a strong preference for Microsoft Azure
- Deep expertise in one or more of Python, Spark, PySpark, and SQL, including building in a development environment and deploying to production
- Experience building robust data pipelines
- Experience using Databricks for data engineering and processing
- Experience using Azure DevOps, GitHub, or other version control tools
- Experience developing end-to-end production solutions
- Ability to integrate components into cohesive solutions that span multiple systems
- Excellent spoken and written English, at an upper-intermediate level or higher

Nice to have
- Experience with big data technologies such as Hadoop, Hive, and Kafka
- Experience with Azure Data Factory or other ETL tools

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn