Senior Data Engineer (GCP, Oracle)
Key details at a glance:
- Client: International Consulting Firm (Public & Private Sector Focus)
- Location: Poland only
- Type: Full-time
- Must have: fintech, GCP, BigQuery, and DBT experience
- Start: ASAP (within 1-2 weeks of approval)
- Rate: ~4,000-5,000 USD gross (B2B)
About the Company
We’re an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks. From AI platforms to digital transformation, we partner with enterprise leaders to build what’s next. At Exadel, our people are ambitious, collaborative, and constantly evolving.
About the Client
Our client is an international consulting firm dedicated to helping companies of all sizes achieve a better global impact. They focus on supporting organizations through people, process, and digital technology challenges.
What You’ll Do
- Pipeline Development: Design, build, and maintain scalable data pipelines primarily within Google Cloud Platform (GCP).
- Hybrid Integration: Develop integration and transformation workflows between cloud data services and on-premises Oracle databases.
- Cross-functional Collaboration: Work closely with trading, risk, and analytics teams to understand data requirements and deliver both real-time and batch data solutions.
- System Optimization: Optimize and monitor the performance of data systems to support latency-sensitive trading applications.
- Agile Delivery: Collaborate with teams using Agile/Scrum methodologies to deliver business-critical data projects.
- Governance & Compliance: Ensure robust data governance, lineage, and compliance with regulatory standards (e.g., MiFID II, FCA).
- Automation: Automate data workflows using Terraform, CI/CD pipelines, and containerization tools such as Docker and Kubernetes.
What You Bring
- Cloud Expertise: Strong experience with GCP (BigQuery, Dataplex, Pub/Sub, Dataflow) and DBT, plus familiarity with AWS components (S3, Glue, Redshift).
- Database Mastery: Expertise in Oracle SQL, PL/SQL, and working with complex stored procedures and large datasets.
- Programming Skills: Proficiency in Python, Java, or Scala.
- DevOps & Infrastructure: Experience with CI/CD tooling (Jenkins, GitLab), container orchestration (Kubernetes), and Infrastructure as Code (Terraform).
- Engineering Fundamentals: Deep understanding of data modeling, data warehousing, and ETL/ELT design patterns.
- Methodology: Familiarity with Agile practices such as Scrum and Kanban, and with tools like Jira.
- Industry Context: Exposure to financial markets, trading systems, or high-performance environments is a significant advantage.
Nice to Have
- GCP Professional Data Engineer certification.
- Direct experience with Amazon Web Services (AWS).
- Knowledge of regulatory reporting, market data, or trade surveillance systems.
- Experience with Apache Airflow or similar orchestration tools.
- Understanding of data security practices and compliance frameworks.
Required languages
| Language | Level |
| --- | --- |
| English | B2 - Upper Intermediate |