Data Engineer with GCP
We are looking for an experienced Data Engineer to apply their skills in a fast-paced financial trading environment. You will be responsible for designing, developing, and maintaining data pipelines and infrastructure across GCP and Oracle-based systems, ensuring high-performance data flows that support trading, risk, and compliance functions. This role requires a deep understanding of cloud architecture, large-scale data processing, and a strong grasp of agile delivery practices.
About the client
An international consulting firm that helps companies of all sizes have a better impact on the world. Its capabilities focus on supporting the private and public sectors with their people, process, and digital technology challenges.
What You Will Do
• Design, build, and maintain scalable data pipelines primarily within Google Cloud Platform
• Develop integration and transformation workflows between cloud data services and on-prem Oracle databases
• Work closely with trading, risk, and analytics teams to understand data requirements and deliver real-time and batch data solutions
• Optimise and monitor performance of data systems to support latency-sensitive trading applications
• Collaborate with cross-functional teams using Agile/Scrum methodologies to deliver business-critical data projects
• Ensure robust data governance, lineage, and compliance (including MiFID II, FCA, and other regulatory standards)
• Automate data workflows using Terraform, CI/CD pipelines, and containerisation tools (Docker/Kubernetes)
What you will bring
• Strong experience with Google Cloud Platform (GCP), e.g. BigQuery, Dataplex, dbt, Pub/Sub, Dataflow; experience with AWS equivalents (S3, Glue, Redshift) is also valuable
• Expertise in Oracle SQL, PL/SQL, and working with complex stored procedures and large datasets
• Proficiency in programming languages such as Python, Java, or Scala
• Experience with CI/CD tooling (e.g. Jenkins, GitLab) and container orchestration (Kubernetes)
• Experience working with Infrastructure as Code, preferably Terraform, to manage cloud data infrastructure
• Deep understanding of data modelling, data warehousing, and ETL/ELT design patterns
• Familiarity with Agile development practices (Scrum, Kanban, Jira)
• Exposure to financial markets, trading systems, or related high-performance environments is a strong plus
Nice to have
• GCP certification (e.g. Professional Data Engineer or similar) is a strong advantage
• Experience with Amazon Web Services (AWS)
• Knowledge of regulatory reporting, market data, or trade surveillance systems
• Experience with Apache Airflow, dbt, or similar orchestration tools
• Understanding of data security practices and compliance frameworks
Required languages
| Language | Level |
| --- | --- |
| English | B2 - Upper Intermediate |