Dataforest. Empower the data.

Senior Data Engineer

Dataforest is looking for a Senior Data Engineer to join our team and work on the Dropship project, a cutting-edge data intelligence platform for e-commerce analytics.
You will be responsible for developing and maintaining a scalable data architecture that powers large-scale data collection, processing, analysis, and integrations.

If you are passionate about data optimization, system performance, and architecture, we look forward to receiving your CV!

Requirements:
• 4+ years of commercial experience with Python;
• Advanced experience with SQL databases (optimization, monitoring, etc.);
• PostgreSQL is a must-have;
• Solid understanding of ETL principles (architecture, monitoring, alerting, finding and resolving bottlenecks);
• Experience with message brokers: Kafka, Redis;
• Experience with Pandas;
• Familiarity with AWS infrastructure (boto3, S3 buckets, etc.);
• Experience working with large volumes of data;
• Understanding of the principles of medallion architecture.
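For applicants new to the last term: medallion architecture layers data as bronze (raw, as landed), silver (cleaned and typed), and gold (business-level aggregates). A minimal, self-contained sketch of the idea; all field names and records below are hypothetical, for illustration only:

```python
# Medallion architecture in miniature: bronze -> silver -> gold.
# All data and field names here are hypothetical.

def bronze_ingest(raw_events):
    """Bronze: land raw records as-is, with no validation."""
    return list(raw_events)

def silver_clean(bronze):
    """Silver: drop malformed rows and normalise types."""
    cleaned = []
    for row in bronze:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # discard malformed records
        cleaned.append({"order_id": row["order_id"],
                        "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver):
    """Gold: business-level aggregate (total revenue)."""
    return {"total_revenue": sum(r["amount"] for r in silver)}

raw = [
    {"order_id": 1, "amount": "19.50"},
    {"order_id": None, "amount": "5.00"},   # malformed, dropped at silver
    {"order_id": 2, "amount": "30.25"},
]
report = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(report)  # {'total_revenue': 49.75}
```

In a production lakehouse each layer would be a persisted table rather than an in-memory list, but the contract between layers is the same.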

Will Be a Plus:
• Understanding of NoSQL databases (Elasticsearch);
• TimeScaleDB;
• PySpark;
• Experience with e-commerce or fintech.
 

Key Responsibilities:

• Develop and maintain a robust and scalable data processing architecture using Python.
• Design, optimize, and monitor data pipelines using Kafka and AWS SQS.
• Implement and optimize ETL processes for various data sources.
• Manage and optimize SQL and NoSQL databases (PostgreSQL, TimeScaleDB, Elasticsearch).
• Work with AWS infrastructure to ensure reliability, scalability, and cost efficiency.
• Proactively identify bottlenecks and suggest technical improvements.
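The bottleneck-hunting side of the role can be pictured with a simple per-stage timer that records how long each pipeline step takes; the stage names and workloads below are illustrative stand-ins, not the project's actual pipeline:

```python
import time
from contextlib import contextmanager

# Collect wall-clock time per pipeline stage so the slowest
# stage (the bottleneck) is easy to spot.
timings = {}

@contextmanager
def stage(name):
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = time.perf_counter() - start

with stage("extract"):
    data = list(range(100_000))      # stand-in for a source read
with stage("transform"):
    data = [x * 2 for x in data]     # stand-in for cleaning logic
with stage("load"):
    total = sum(data)                # stand-in for a sink write

bottleneck = max(timings, key=timings.get)
print(f"slowest stage: {bottleneck}")
```

In practice the same idea is usually delegated to metrics and alerting tooling rather than hand-rolled, but the measurement principle is identical.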

 

We offer:

• Working in a fast-growing company;
• Great networking opportunities with international clients and challenging tasks;
• Personal and professional development opportunities;
• Competitive salary fixed in USD;
• Paid vacation and sick leave;
• Flexible work schedule;
• Friendly working environment with minimal hierarchy;
• Team-building activities and corporate events.


 

Required skills experience:

• Python: 4 years
• PostgreSQL: 3 years
• Kafka: 3 years
• Python Pandas: 3 years
• AWS: 3 years
• SQL DBs: 3 years

Required languages

English B2 - Upper Intermediate
Published 15 January