Senior DWH Engineer
Why this role matters
This role is about much more than building pipelines. You'll be laying the groundwork for data-driven decision-making company-wide: designing robust DWH architecture, fine-tuning performance, ensuring data quality, and driving automation. You'll turn complexity into clarity, raw data into actionable insight, and operational chaos into a streamlined system.
Your daily adventures:
1. DWH architecture and design
- Designing Data Warehouse architecture for current and future business needs
- Developing schemas with performance, scalability, and data historization in mind (SCD, snapshots)
- Defining standards and best practices for data storage, transformation, and access
- Participating in cloud migration planning
2. ETL/ELT process organization
- Building, maintaining, and optimizing ETL/ELT pipelines (Airflow, dbt, custom solutions)
- Implementing incremental updates, CDC, backfills, and reprocessing
- Automating and overseeing data lineage, logging, and alerting
3. Performance optimization
- Deep optimization of queries, tables, and DAGs
- Implementing batching, indexes, materialized views, and clustering
- Managing server resources; monitoring and balancing load
- Defining ETL performance metrics and running regular audits
4. Data quality and reliability
- Implementing data validation, anomaly detection, and reconciliation
- Setting up automated tests for ETL processes
- Managing backup & recovery policies
- Identifying and eliminating duplicates, null values, and data drift
5. Integration of new data sources
- Evaluating and connecting external APIs, raw sources, and third-party databases
- Harmonizing formats, update frequencies, and transformation logic
- Adapting database schemas to new sources without disrupting existing processes
6. DevOps and automation
- Automating deployments, testing, and CI/CD for data
- Working with Docker, Kubernetes, and cloud infrastructure (GCP, AWS, Azure)
- Working with Terraform
- Using Git and code review processes to manage pipelines
7. Mentoring and coordination
- Code review, support, and training of junior/middle engineers
- Creating documentation, templates, and onboarding guides
- Collaborating with analysts, developers, and the BI team
- Working with business stakeholders to understand their needs and translate them into technical requirements
What makes you a great match
- Proficient in SQL: complex queries, CTEs, window functions, analytics
- Deep understanding of DWH concepts: ETL, ELT, Data Vault, Kimball, Star/Snowflake schemas
- Experience with Airflow/dbt or other pipeline orchestrators
- Proficiency in one or more DWH platforms: BigQuery, Snowflake, Redshift, ClickHouse, Vertica, etc.
- Proficient with Git; experience helping colleagues with Git flow (merge conflicts, rebases, pull requests)
- Knowledge of Python or another scripting language for transformations
- Understanding of server infrastructure: basic skills in configuring, maintaining, and monitoring resources and load
- Experience with CI/CD, automated data testing
- Experience implementing backfills and configuring backup & recovery
What we offer
At Gypsy, we believe work should fuel your growth, not drain it. Here's what you can expect when you join the journey:
- Time to recharge: 20 working days of paid vacation annually, plus 2 additional business days off each quarter.
- Remote-first & flexible: Work from anywhere, with the freedom to shape your day.
- Modern equipment: We'll set you up with everything you need to do your best work.
- Growth support: English language training, a personal learning budget, career path clarity, and access to top industry events.
- Well-being matters: Mental health support, fitness perks, and a culture that respects your time and boundaries.
- Empowered culture: Transparent communication, diverse perspectives, and impact-driven collaboration.