DataOps Engineer (Kraków)

Up to $5000

Who we are:

Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.

About the Product:

Bigabid's platform processes 50 TB+ of raw data daily, handles 4M+ requests/second, and reaches over 1 billion unique users weekly. As a DataOps Engineer, you will be the reliability guardian of this data ecosystem: catching issues before they hit production, responding to operational incidents, and building the automation and infrastructure that keeps everything running.

About the role:

We’re looking for a DataOps Engineer who will focus on data reliability, monitoring, and quality across production pipelines.

This role combines:

  • proactive work (building monitoring, improving pipelines, adding tests)
  • reactive work (handling alerts, debugging issues, fixing incidents)

You’ll be responsible for keeping data pipelines stable, accurate, and transparent for downstream teams.

What you’ll do:

  • Build and maintain monitoring & alerting for data pipelines (freshness, accuracy, health); a sketch of such a check follows this list
  • Detect anomalies and investigate issues before they affect the business
  • Implement data quality checks, validation rules, and automated tests
  • Troubleshoot incidents, find root causes, and document solutions
  • Work with Airflow, Python, and SQL for operational tasks and fixes
  • Maintain metadata and documentation (tables, lineage, sources)
  • Improve performance and reliability of existing pipelines
  • Collaborate with Data Engineers and business stakeholders
     
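To make the day-to-day concrete, here is a minimal sketch of the kind of freshness check this role builds, assuming Airflow 2.x; the table name, lag threshold, and query helper are illustrative assumptions, not Bigabid's actual pipeline code:

from datetime import datetime, timedelta, timezone

from airflow.decorators import dag, task


def get_latest_event_time() -> datetime:
    # Placeholder for a real lookup (e.g. SELECT MAX(event_time) FROM events
    # via a Presto/Athena hook); stubbed so the sketch is self-contained.
    return datetime.now(timezone.utc) - timedelta(minutes=30)


@dag(
    schedule="@hourly",  # re-check freshness every hour
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    tags=["dataops", "monitoring"],
)
def events_freshness_check():
    @task
    def check_freshness(max_lag_minutes: int = 60) -> None:
        lag = datetime.now(timezone.utc) - get_latest_event_time()
        if lag > timedelta(minutes=max_lag_minutes):
            # A failing task is what surfaces in Airflow alerting
            # (email, Slack, PagerDuty, and similar channels).
            raise ValueError(f"events table is stale: last record {lag} ago")

    check_freshness()


events_freshness_check()

A check like this turns "data went stale" into an actionable alert instead of a silent downstream problem.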

Requirements:

  • 3+ years in Data Engineering / DataOps / similar roles
  • Strong Python + SQL
  • Hands-on experience with Airflow
  • Experience with PySpark
  • Experience with data warehouses (MySQL, Presto, Athena or similar)
  • Experience with monitoring / observability / alerting tools
  • Strong focus on data quality and reliability
  • English for daily communication (B2 / Upper-Intermediate)

Nice to have:

  • DevOps or NOC background
  • Experience with data governance / metadata tools
  • Familiarity with tools like Great Expectations, dbt tests, Monte Carlo (see the sketch after this list)
     
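For a taste of the declarative style these tools encourage, here is a minimal Great Expectations sketch using the classic pandas-backed 0.x API; the column names and thresholds are made up for illustration:

import great_expectations as ge
import pandas as pd

# Hypothetical sample of a bid-events table.
df = ge.from_pandas(
    pd.DataFrame(
        {
            "user_id": ["a1", "b2", "c3"],
            "bid_price": [0.12, 0.40, 0.07],
        }
    )
)

# Declarative expectations: non-null keys and a sane value range.
print(df.expect_column_values_to_not_be_null("user_id").success)          # True
print(df.expect_column_values_to_be_between("bid_price", 0, 100).success) # True

dbt tests express similar rules declaratively alongside models, while Monte Carlo approaches the same goal from the observability side.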

Working setup:

  • Hybrid format: Kraków office (~10 days/month)
  • Small, experienced team with high ownership
  • English-speaking environment

What we offer:

  • Up to $5000 salary
  • 20 paid vacation days + public holidays
  • Fully covered accounting & legal support
  • Equipment and co-working support if needed
  • Regular compensation reviews
