Data Engineer

We’re looking for a Data Engineer who will design, build, and maintain reliable data pipelines, ensuring high data quality across multiple internal and external systems. You’ll work closely with product, analytics, and engineering teams to develop robust ETL processes and support data-driven decision-making.
 

Responsibilities:

  • Build and maintain ETL processes integrating data from various internal and external IT systems.
  • Design and implement efficient orchestration for ETL pipelines (Airflow/Prefect/dbt).
  • Manage and support webhook ingestion pipelines, ensuring reliability and deduplication.
  • Design and optimize SQL data marts for analytics and reporting.
  • Ensure data quality, detect anomalies, and prepare control reports.
  • Perform one-time data loads and backfills when needed.

Requirements:

  • Strong SQL skills (Postgres or similar): CTEs, window functions, query profiling, and optimization for large tables.
  • Proficiency in Python for production scripts and automation (pandas / pyarrow / requests / asyncio).
  • Hands-on experience in web scraping (Playwright / Selenium / Scrapy), proxy rotation, anti-bot & CAPTCHA bypass, incremental updates.
  • Experience with webhooks: ingestion design, idempotency/deduplication, retries, integrity and latency control.
  • Solid understanding of ETL/ELT orchestration (cron / Airflow / Prefect / dbt); incremental loads, monitoring, and notifications.
  • Working with APIs (REST / GraphQL) and data formats (JSON / CSV / Parquet).
  • Strong Data Quality mindset: validation tests, reconciliation, data contracts, troubleshooting discrepancies in financial metrics (bets / wins / GGR, i.e. gross gaming revenue).

Nice to Have:

  • Experience implementing data quality metrics and data contracts (consistency, completeness).
  • Hands-on with Spark (PySpark), Airflow, S3, and data profiling tools (ydata-profiling / Jupyter).
  • Experience setting up monitoring and logging (Grafana).
  • Familiarity with popular formats: Parquet / CSV / JSON / Iceberg.
  • Experience working with BigQuery.
  • Basic BI tools knowledge (Power BI / Tableau / Metabase) for dashboard creation.

What We Offer:

  • Remote-first work format.
  • Flexible working hours.
  • Opportunity to be part of a rapidly growing iGaming product.

Required Languages:

  • English: B1 (Intermediate)
  • Russian: B2 (Upper-Intermediate)
Published 24 October