Senior Data Platform Engineer
At Grid Dynamics, we are seeking a Senior Data Platform Engineer to join our team of experts. This role focuses on developing and maintaining a scalable data platform using cutting-edge technologies to meet the client’s evolving needs. The ideal candidate is a proactive problem-solver who is passionate about working with complex data systems and enjoys collaborating in an innovative and supportive environment.
About the Project:
Join our team working with the largest pan-European online car marketplace with over 1.5 million listings and 43,000 car dealer partners. Our client provides inspiring solutions and services that empower customers and deliver real value. As part of this dynamic project, you’ll play a key role in shaping and optimizing their data platform, leveraging modern tools and methodologies.
Responsibilities:
Core Data Platform Development:
Develop and maintain scalable data pipelines and integrations to manage increasing data volume and complexity.
Design and implement data contracts to streamline communication and dependencies between teams.
Build pipelines both from scratch and from templates, using modern tools and techniques.
Collaboration & Quality:
Work with analytics and business teams to improve the data models that feed business intelligence tools, fostering data-driven decision-making.
Implement and monitor systems ensuring data quality, governance, and accuracy for all production data.
Data Infrastructure Management:
Manage and enhance the data platform, incorporating technologies such as Airflow and AWS Glue jobs and applying data mesh principles.
Design data integrations and establish a data quality framework.
Define company data assets, document transformations, and maintain engineering wikis.
Operations & Compliance:
Collaborate with engineering, product, and analytics teams to develop and maintain strategies for long-term data platform architecture.
Troubleshoot and resolve data-related issues in production environments.
Tech Stack:
Cloud Technologies: AWS (Athena, Glue, EMR, Firehose, etc.), Azure, GCP.
Data Tools: Airflow, Hadoop, Spark, Trino, Kafka.
Programming Languages: Python, SQL.
Additional Tools: DataStage, Jenkins, Git, Linux, AIX, z/OS.
Qualifications:
Must have:
Expertise in AWS services, especially Glue, Athena, and MWAA (Amazon Managed Workflows for Apache Airflow)
Proficiency in Python and SQL
Experience with streaming platforms (Kafka or Firehose)
Experience with third-party solutions and APIs
Nice to have:
Proficiency in data modeling techniques and best practices
Experience in implementing data contracts
Experience in applying data governance policies
Experience with data quality frameworks (Great Expectations, Soda)
Familiarity with the data mesh architecture and its principles
Required languages:
English: B2 (Upper Intermediate)