Data ETL Engineer
Domain: Banking and Finance
Location: Remote from Poland
Experience Required: 3+ years
Project Overview
We are looking for an experienced Data ETL Engineer to join our client’s project and help design, build, and maintain data pipelines within the Google Cloud Platform (GCP) ecosystem.
The role focuses on developing scalable ETL/ELT processes, improving data workflows, and ensuring stable and reliable data delivery for analytics and business applications.
In this position, you will work with large-scale data environments, integrate data from multiple sources, and ensure efficient, secure, and high-quality data processing in line with modern DataOps and cloud engineering practices.
Technical Requirements
Must have:
- 3+ years of experience in SQL development and query optimization, especially in BigQuery
- Experience designing and implementing ETL/ELT pipelines and data transformation processes
- Practical experience with GCP data services such as BigQuery, Data Fusion, Cloud Composer / Airflow, or similar tools
- Hands-on experience with Data Vault modeling
- Programming experience in Python
- Familiarity with Terraform
- Experience with CI/CD pipelines and DevOps tools (e.g., Git, Jenkins, Ansible)
- Experience working in Agile teams and following DataOps practices
- Strong analytical thinking and problem-solving skills
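Since hands-on Data Vault modeling is a must-have, a minimal sketch of hub/link hash-key generation may help candidates gauge the expected depth. The function name, delimiter, and normalization rules below are illustrative conventions, not the client's actual standard:

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Compute a deterministic Data Vault hash key from business keys.

    Keys are trimmed, upper-cased, and joined with a delimiter before
    hashing -- a common convention so the same business entity always
    maps to the same hub/link key regardless of source formatting.
    """
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same customer number yields the same key despite formatting noise:
assert hash_key(" c-1001 ") == hash_key("C-1001")
```

The delimiter matters: joining with `||` keeps `("A", "B")` and `("AB", "")` from colliding into the same key.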
Nice to have:
- Experience building data ingestion pipelines for formats such as CSV, JSON, and XML
- Experience integrating data from REST or SOAP APIs, SFTP servers, and enterprise systems
- Understanding of data contract best practices
- Experience with Java development or creating custom plugins for data integration tools
- Experience with continuous testing and delivery for cloud-based data platforms
Soft skills:
- Strong communication and collaboration skills
- Ability to work independently and manage several tasks simultaneously
- Proactive mindset and structured approach to problem solving
- Willingness to continuously learn and improve technical skills
- Team-oriented attitude and ability to collaborate with cross-functional teams
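To illustrate the ingestion formats mentioned above, the sketch below normalizes CSV and JSON inputs into a common list-of-dicts record shape before further transformation. It uses only the Python standard library; the function names are hypothetical:

```python
import csv
import io
import json

def records_from_csv(text: str) -> list[dict]:
    """Parse CSV text (header row required) into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def records_from_json(text: str) -> list[dict]:
    """Parse a JSON array of objects (or a single object) into row dicts."""
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

# Both sources end up in the same record shape, ready for one loader:
rows = records_from_csv("id,amount\n1,9.99\n2,5.00\n")
rows += records_from_json('[{"id": "3", "amount": "1.25"}]')
print(rows)
```

An XML source would slot in the same way with a third `records_from_*` adapter, keeping the downstream pipeline format-agnostic.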
Required Technical Skills
- SQL
- BigQuery
- ETL & Data Management Tools
- CI/CD
- Python
- Terraform
- Agile
Responsibilities
- Design, develop, test, and deploy data models and transformations in BigQuery using SQL and related tools
- Build and maintain ETL/ELT pipelines to transform raw and unstructured data into structured datasets using Data Vault modeling
- Integrate data from multiple sources, including on-premise systems, APIs, and cloud platforms
- Monitor and troubleshoot data pipelines to detect performance issues, failures, or data inconsistencies
- Optimize ETL/ELT processes to improve performance, scalability, and cost efficiency
- Implement business and technical requirements in data transformation workflows
- Ensure solutions meet non-functional requirements such as security, reliability, scalability, and compliance with IT standards
- Manage code repositories and CI/CD pipelines using tools like Git and Jenkins
- Collaborate with DevOps and data teams to enable automated deployment, testing, and monitoring
- Provide bug fixes, improvements, technical documentation, and support knowledge transfer to operational teams
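As an example of the monitoring responsibility above, a minimal batch-level data-quality check might flag row-count shortfalls and missing required fields before a load proceeds. The thresholds, field names, and message wording are illustrative:

```python
def check_batch(rows: list[dict], required: set[str], min_rows: int = 1) -> list[str]:
    """Return a list of data-quality issues found in one pipeline batch.

    Two illustrative checks: the batch meets a minimum row count, and
    every row has a non-empty value for each required field.
    """
    issues = []
    if len(rows) < min_rows:
        issues.append(f"row count {len(rows)} below minimum {min_rows}")
    for i, row in enumerate(rows):
        for field in required:
            if not row.get(field):
                issues.append(f"row {i}: missing or empty '{field}'")
    return issues

# An empty 'amount' field is reported; a clean batch returns no issues:
print(check_batch([{"id": "1", "amount": ""}], {"id", "amount"}))
```

In practice such checks would feed an alerting channel or fail the orchestrator task, so inconsistencies surface before they reach analytics consumers.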
Required Languages
| Language | Level |
| English | B2 - Upper Intermediate |