Senior Data Engineer
We are looking for a highly experienced Senior Data & AI Cloud Engineer to lead complex data platform migrations from legacy or multi-cloud environments to Google Cloud Platform (GCP). This role is ideal for a senior-level data engineer with deep cloud expertise, strong architectural thinking, and hands-on experience with large-scale enterprise data ecosystems.
You will drive high-impact migration initiatives, modernize and redesign data platforms, and contribute to key architectural decisions. This is a strategic technical role requiring strong ownership, leadership, and technical excellence.
Who Will Succeed in This Role:
- Senior Data Engineers with strong GCP knowledge.
- Cloud Data Engineers / Data Architects specialized in migration projects.
- Engineers who have participated in full data platform modernization initiatives.
- Candidates comfortable with both hands-on development and architectural leadership.
What You Will Do:
Migration & Architecture:
- Lead the end-to-end design and execution of data migration strategies from AWS/Databricks/Spark to GCP.
- Define the roadmap and architecture for BI migration, including dashboards and semantic layers (e.g., QuickSight → BigQuery-based BI).
- Refactor complex Spark/PySpark pipelines into GCP-native solutions (BigQuery SQL, Dataflow/Beam, Dataproc Serverless); a refactor sketch follows this list.
- Oversee the migration of large-scale DBT projects (50–60+ interdependent models), ensuring functional parity.
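As an illustration of the Spark-to-GCP refactor mentioned above, here is a minimal sketch of a PySpark-style aggregation re-expressed as an Apache Beam pipeline that could run on Dataflow. All dataset, table, and field names (`dataset.events`, `user_id`, `amount`, `user_totals`) are hypothetical placeholders, not references to any real project.

```python
# Sketch: re-expressing a PySpark aggregation as an Apache Beam pipeline.
# PySpark original, for reference:
#   df.groupBy("user_id").agg(F.sum("amount"))
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # On Dataflow, pass --runner=DataflowRunner, --project, --region,
    # and --temp_location; ReadFromBigQuery also needs a temp location.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromBigQuery(
                query="SELECT user_id, amount FROM dataset.events",  # hypothetical source
                use_standard_sql=True,
            )
            | "KeyByUser" >> beam.Map(lambda row: (row["user_id"], row["amount"]))
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "total_amount": kv[1]})
            | "WriteTotals" >> beam.io.WriteToBigQuery(
                "dataset.user_totals",  # hypothetical target
                schema="user_id:STRING,total_amount:FLOAT64",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

When the logic is pure relational aggregation like this, a scheduled BigQuery SQL statement is usually the simpler target; Beam earns its keep for streaming inputs or non-trivial Python transforms, which is exactly the lift-and-shift vs. native-refactoring call made during PoC.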
Data Quality & Validation:
- Create and implement a comprehensive data validation and reconciliation framework; a minimal reconciliation sketch follows this list.
- Conduct advanced comparative testing between legacy systems and GCP targets.
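One way such a reconciliation framework might start is a count-and-checksum comparison between a legacy extract staged in BigQuery and the migrated target table. This is a minimal sketch; the table names (`legacy_stage.orders`, `prod.orders`) are hypothetical, and it assumes both tables expose identical column names, order, and types.

```python
# Count-and-checksum reconciliation between a staged legacy extract and the
# migrated BigQuery target. Table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

CHECK_SQL = """
SELECT
  (SELECT COUNT(*) FROM `legacy_stage.orders`)  AS legacy_rows,
  (SELECT COUNT(*) FROM `prod.orders`)          AS target_rows,
  -- BIT_XOR of per-row fingerprints: order-independent and overflow-safe
  (SELECT BIT_XOR(FARM_FINGERPRINT(TO_JSON_STRING(t)))
     FROM `legacy_stage.orders` AS t)           AS legacy_checksum,
  (SELECT BIT_XOR(FARM_FINGERPRINT(TO_JSON_STRING(t)))
     FROM `prod.orders` AS t)                   AS target_checksum
"""

row = list(client.query(CHECK_SQL).result())[0]
assert row.legacy_rows == row.target_rows, "row-count mismatch"
assert row.legacy_checksum == row.target_checksum, "checksum mismatch"
print(f"reconciled {row.target_rows} rows")
```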
Infrastructure & Automation:
- Build and maintain infrastructure using Terraform for all GCP data platform resources.
- Develop automated CI/CD pipelines using GitLab for deployment, testing, and data model validation.
- Optimize orchestration using Airflow/Cloud Composer; a DAG sketch follows this list.
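For the orchestration bullet above, a minimal Cloud Composer (Airflow 2) DAG might chain a BigQuery model refresh to a downstream validation task using the Google provider's `BigQueryInsertJobOperator`. The DAG id, schedule, and SQL below are hypothetical placeholders, not a prescribed setup.

```python
# Minimal Cloud Composer / Airflow 2 DAG: refresh a model, then validate it.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",  # daily at 03:00 UTC
    catchup=False,
) as dag:
    refresh = BigQueryInsertJobOperator(
        task_id="refresh_orders_model",
        configuration={"query": {
            "query": "CREATE OR REPLACE TABLE prod.orders AS SELECT * FROM staging.orders",
            "useLegacySql": False,
        }},
    )
    validate = BigQueryInsertJobOperator(
        task_id="validate_not_empty",
        configuration={"query": {
            # ERROR() fails the query, and therefore the task, on an empty table
            "query": "SELECT IF(COUNT(*) > 0, 1, ERROR('prod.orders is empty')) FROM prod.orders",
            "useLegacySql": False,
        }},
    )
    refresh >> validate
```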
Technical Leadership:
- Provide mentorship to mid-level engineers.
- Support the team with code reviews, migration tasks, documentation, and validation efforts.
- Participate in PoC phases and shape technical decisions (e.g., lift-and-shift vs. native refactoring).
What We're Looking For:
Must-Have Experience:
- 5+ years as a Data Engineer working on enterprise-scale data platforms.
- 3+ years hands-on experience with GCP data stack: BigQuery, Dataflow / Apache Beam, Cloud Composer, Vertex AI, Dataplex.
- Proven experience migrating data platforms from Databricks/Spark to GCP.
- Strong expertise in DBT (dbt-core, dbt-bigquery) with large projects (50+ models).
- Excellent Python skills (including pandas, numpy) and advanced SQL optimization abilities.
- Experience converting PySpark logic into Dataflow/Beam pipelines.
- Strong Terraform experience for GCP infrastructure.
- Solid experience with CI/CD (GitLab), Docker, and container-based workflows.
- Hands-on experience with Apache Airflow.
Nice to Have:
- Experience with BI migrations (QuickSight → BigQuery ecosystem).
- Experience with Dataproc Serverless or other managed Spark environments.
- Understanding of data governance frameworks on GCP.
Working conditions:
- Paid vacation (15 business days per year).
- Flexible working hours (8 hours a day).
- Paid workshops, conferences, and courses.
Required skills and experience:

| Skill | Experience |
| --- | --- |
| Data Engineering | 5 years |
| BigQuery | 3 years |
| Dataform | 3 years |
| Apache | 3 years |
Required languages:

| Language | Level |
| --- | --- |
| English | B2 - Upper Intermediate |