Gypsy Collective

DWH Engineer

Description

We are looking for a Middle/Senior DWH Engineer to design, build, and maintain scalable data warehouse solutions that power analytics and business decision-making. You will work across the full data lifecycle, from ingestion and modeling to optimization, reliability, and automation, collaborating closely with analysts, developers, and business stakeholders.

 

Requirements

 

  • Strong SQL skills: complex queries, CTEs, window functions, analytical queries (5+ years experience);
  • Knowledge of Python or other scripting languages for data transformations (3+ years experience);
  • Deep understanding of DWH concepts: ETL/ELT, Data Vault, Kimball, Star/Snowflake schemas (4+ years experience);
  • Experience with Airflow or other data pipeline orchestrators (3+ years experience);
  • Hands-on experience with modern DWH and query engines: BigQuery, Snowflake, Redshift, ClickHouse, Vertica, AWS Athena, Trino (2+ years experience);
  • Confident use of Git; experience with team workflows (pull requests, rebasing, merge conflict resolution) (5+ years experience);
  • Understanding of server and cloud infrastructure: basic skills in configuration, maintenance, monitoring, and load control (2+ years experience).
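As a small, hedged illustration of the SQL skills listed above (CTEs plus window functions), here is a self-contained sketch using Python's built-in sqlite3 module; the table, columns, and data are purely hypothetical, and window functions require SQLite 3.25 or newer:

```python
import sqlite3

# Hypothetical data: rank daily revenue within each region
# using a CTE and the RANK() window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, day TEXT, revenue REAL);
INSERT INTO orders VALUES
  ('EU', '2024-01-01', 100.0),
  ('EU', '2024-01-02', 300.0),
  ('US', '2024-01-01', 250.0),
  ('US', '2024-01-02', 150.0);
""")

query = """
WITH daily AS (
    SELECT region, day, SUM(revenue) AS rev
    FROM orders
    GROUP BY region, day
)
SELECT region, day, rev,
       RANK() OVER (PARTITION BY region ORDER BY rev DESC) AS rnk
FROM daily
ORDER BY region, rnk;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

Within each region the highest-revenue day gets rank 1; the same pattern carries over to BigQuery, Trino, or ClickHouse with minor dialect changes.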

 

Nice to Have

  • Experience with CDC tools and streaming data sources;
  • Knowledge of Docker, Kubernetes, and Infrastructure as Code (Terraform);
  • Experience with cloud platforms: AWS, GCP, or Azure;
  • Familiarity with data governance, data cataloging, and lineage tools.

 

Responsibilities

  • Design, build, and maintain scalable Data Warehouse architectures aligned with business needs;
  • Develop and optimize ETL/ELT pipelines using Python/Airflow and custom solutions;
  • Work with DWH/Data Lake: PostgreSQL, Trino, BigQuery;
  • Implement incremental loads, CDC, backfills, and reprocessing strategies;
  • Optimize query performance, data models, and pipeline execution;
  • Ensure data quality through validation, automated testing, monitoring, and alerting;
  • Integrate new data sources (APIs, third-party systems, raw data) without disrupting existing pipelines;
  • Collaborate with analysts, engineers, BI teams, and business stakeholders to translate requirements into data solutions;
  • Mentor engineers, review code, and contribute to data standards and best practices.
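The incremental-load responsibility above is often implemented with a watermark: only source rows newer than the last loaded timestamp are copied, then the watermark advances. A minimal runnable sketch, again with hypothetical tables and Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical watermark-based incremental load:
# copy only rows with updated_at later than the stored watermark.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, updated_at TEXT, value TEXT);
CREATE TABLE dwh (id INTEGER, updated_at TEXT, value TEXT);
CREATE TABLE watermarks (table_name TEXT PRIMARY KEY, last_ts TEXT);
INSERT INTO src VALUES
  (1, '2024-01-01', 'a'),
  (2, '2024-01-02', 'b'),
  (3, '2024-01-03', 'c');
INSERT INTO watermarks VALUES ('src', '2024-01-01');
""")

def incremental_load(conn):
    # Read the current watermark, load newer rows, then advance it.
    (last_ts,) = conn.execute(
        "SELECT last_ts FROM watermarks WHERE table_name = 'src'"
    ).fetchone()
    conn.execute(
        "INSERT INTO dwh SELECT * FROM src WHERE updated_at > ?", (last_ts,))
    conn.execute(
        "UPDATE watermarks SET last_ts = "
        "(SELECT MAX(updated_at) FROM src) WHERE table_name = 'src'")
    conn.commit()

incremental_load(conn)
loaded = conn.execute("SELECT id FROM dwh ORDER BY id").fetchall()
print(loaded)  # rows 2 and 3, the ones newer than the initial watermark
```

Because the watermark advances after each run, re-running the load is idempotent; a CDC tool or backfill job would feed the same pattern with different source predicates.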

Benefits

💸 Flexible payment options: choose the method that works best for you;
🧾 Tax assistance included: we handle part of your taxes and provide guidance on the local setup;
🎁 Financial perks: bonuses for holidays, birthdays, work milestones, and more, just to show we care;
📈 Learn & grow: we cover courses and certifications, and offer real opportunities to grow your career with us;
🏥 Benefit Cafeteria: choose what suits you from sports, language courses, therapy sessions, and more;
🎉 Stay connected: from team-building events to industry conferences, we bring people together online, offline, and on stage;
💻 Modern equipment: we provide new laptops along with essential peripherals like monitors and headphones for a comfortable workflow;
🕘 Your schedule, your rules: start your day at 9, 10, or even 11; we care about results, not clock-ins.

Required languages

English A2 - Elementary
Ukrainian B1 - Intermediate
Keywords: SQL, AWS, DWH, Data Warehouse
Published 16 February