Senior Data Engineer (Python, AWS), up to $8,000

Who we are:
Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries. 

About the Product:
Our client is a leading SaaS company offering pricing optimization solutions for e-commerce businesses. Its advanced technology utilizes big data, machine learning, and AI to assist customers in optimizing their pricing strategies and maximizing their profits.

About the Role:
As a Senior Data Engineer, you will operate at the intersection of data engineering, software engineering, and system architecture. This is a high-impact, cross-functional role in which you will take end-to-end ownership: from designing scalable infrastructure and writing robust, production-ready code to ensuring the reliability and performance of our systems in production.

Key Responsibilities:

  • Build and maintain ETL/ELT pipelines from APIs, Kafka, and databases.
  • Design and manage Airflow DAGs that are modular and observable.
  • Optimize our data lake architecture on S3 and Athena for performance and scalability.
  • Develop and support real-time and event-driven data flows using Kafka and Spark.
  • Implement monitoring, data validation, and alerting to ensure pipeline reliability.
  • Expose clean and structured datasets for internal consumption via Athena or APIs.
  • Collaborate with DevOps and the architect to evolve data infrastructure.

Required Competence and Skills:

  • 5+ years of experience as a data engineer, software engineer, or similar role, with a proven track record of using data to drive business outcomes. 
  • Strong Python skills, with experience building modular, testable, and production-ready code. 
  • AWS Certified Data Analytics – Specialty or AWS Certified Big Data – Specialty certification (current or expired).
  • Solid understanding of databases and SQL, ETL/ELT design, and distributed data processing.
  • Experience with Airflow, Kafka, S3, Athena, Glue, and CI/CD practices.
  • Excellent communication and collaboration skills and a proactive approach.

Nice-to-Haves:

  • Experience with streaming technologies (Flink, Spark Streaming).
  • Experience building internal tools, APIs, or SDKs.

Published 2 September