Senior Data Engineer (Python, AWS), up to $8,000

Who we are:
Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.

About the Product:
Our client is a leading SaaS company offering pricing optimization solutions for e-commerce businesses. Its technology uses big data, machine learning, and AI to help customers optimize their pricing strategies and maximize their profits.

About the Role:
As a Senior Data Engineer, you will operate at the intersection of data engineering, software engineering, and system architecture. This is a high-impact, cross-functional role where you will take end-to-end ownership: from designing scalable infrastructure and writing robust, production-ready code to ensuring the reliability and performance of those systems in production.

Key Responsibilities:

  • Build and maintain ETL/ELT pipelines from APIs, Kafka, and databases.
  • Design and manage Airflow DAGs that are modular and observable.
  • Optimize our data lake architecture on S3 and Athena for performance and scalability.
  • Develop and support real-time and event-driven data flows using Kafka and Spark.
  • Implement monitoring, data validation, and alerting to ensure pipeline reliability.
  • Expose clean and structured datasets for internal consumption via Athena or APIs.
  • Collaborate with DevOps and the architect to evolve data infrastructure.

Required Competence and Skills:

  • 5+ years of experience as a data engineer, software engineer, or similar role, with a proven track record of using data to drive business outcomes. 
  • Strong Python skills, with experience building modular, testable, and production-ready code. 
  • Solid understanding of databases and SQL, ETL/ELT design, and distributed data processing.
  • Experience with Airflow, Kafka, S3, Athena, Glue, and CI/CD practices.
  • Excellent communication and collaboration skills, and a proactive approach.

Nice-to-Haves:

  • Experience with streaming technologies (Flink, Spark Streaming).
  • Experience building internal tools, APIs, or SDKs.

Why Us?

  • We provide 20 days of vacation leave per calendar year (plus the official national holidays of the country you are based in).
  • We provide full accounting and legal support in all the countries where we operate.
  • We offer a fully remote work model, with a powerful workstation and a co-working space if you need it.
  • We offer a highly competitive package with yearly performance and compensation reviews.

Required languages:

English B2 - Upper Intermediate