Data Engineer

Oneeven Investment Office

We are looking for a Data Engineer to join our team and build reliable data pipelines for current and upcoming analytics projects.
You will work with Python, Apache Airflow, ClickHouse, PostgreSQL, AWS S3, and Nextcloud to deliver clean, trusted datasets for BI/reporting and operational dashboards.
Beyond core ETL/ELT work, we expect an agentic data-engineering mindset: automating routine DE tasks (pipeline scaffolding, data checks, incident triage, and documentation) with AI-assisted workflows under clear quality and safety guardrails.
 

Requirements:
- 3+ years of hands-on Data Engineering experience
- Strong Python skills for data pipelines and automation
- Solid experience with Apache Airflow (DAG design, scheduling, monitoring, troubleshooting)
- Advanced SQL and practical experience with ClickHouse and/or PostgreSQL
- Experience with AWS S3, Nextcloud, and data ingestion from files/APIs
- Understanding of ETL/ELT patterns, incremental loads, idempotency, and partitioning
- Experience with data quality checks, logging, and alerting
- Experience (or strong practical interest) in agentic/AI-assisted engineering for automating repetitive workflows
- Good Git workflow and collaboration skills
- English: intermediate+ (written and spoken)
 

Responsibilities:
- Design, develop, and maintain Airflow DAGs for batch data pipelines
- Build and improve data ingestion from S3, Nextcloud, APIs, and relational databases
- Model and optimize tables in ClickHouse for performance and analytics use cases
- Implement and maintain data quality/reconciliation checks
- Monitor pipeline health, troubleshoot production issues, and improve reliability
- Collaborate with analysts/BI stakeholders to deliver marts and reporting datasets
- Document pipeline logic, dependencies, and operational runbooks
- Identify repetitive DE tasks and implement agentic automations to reduce manual effort while keeping strong reliability controls
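To illustrate the data-quality/reconciliation side of the role, here is a minimal Python sketch of the kind of check you would build and automate; the function names, thresholds, and sample data are illustrative assumptions, not our production code:

```python
# Illustrative reconciliation helpers (hypothetical names, stdlib only):
# compare source vs. target row counts and measure a column's null rate.

def reconcile_counts(source_rows: int, target_rows: int,
                     tolerance: float = 0.0) -> bool:
    """Return True if the target row count is within `tolerance`
    (as a fraction of the source count) of the source row count."""
    if source_rows == 0:
        return target_rows == 0
    drift = abs(source_rows - target_rows) / source_rows
    return drift <= tolerance

def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

if __name__ == "__main__":
    sample = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
    print(reconcile_counts(1000, 998, tolerance=0.01))  # True (0.2% drift)
    print(null_rate(sample, "amount"))                  # 0.5
```

In practice, checks like these run as Airflow tasks after each load and raise an alert when a threshold is breached.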
 

Nice to have:
- Experience with custom Airflow operators/plugins
- Knowledge of Slack/Telegram alerting integrations
- Experience in high-volume domains (fintech, gaming, e-commerce)
- Understanding of CI/CD for data workflows and infrastructure-as-code
- Experience with Docker/Kubernetes
- Hands-on experience with LLM APIs/agent frameworks for internal data tooling (with evaluation, observability, and human-in-the-loop controls)
 

What we offer:

- Competitive compensation package with performance-based incentives
- Health insurance compensation
- Sports benefits
- 21 working days of paid vacation per year
- 25 paid sick days annually — no doctor’s note required
- 3 additional paid days off per year for personal use
- Partial reimbursement for psychological support sessions
- $600 annual education budget
- Paid business trips to our Cyprus HQ

Required skills experience:
- Python: 3 years
- Apache Airflow: 3 years
- ClickHouse: 3 years
- PostgreSQL: 3 years
- ETL/ELT: 3 years
- APIs & Data Integration: 3 years

Required domain experience:
- Gambling: 1 year

Required languages:
- English: B2 (Upper Intermediate)
- Ukrainian: Native
Published 27 February