We are looking for an experienced Data Engineer to apply their skills in a fast-paced financial trading environment. You will be responsible for designing, developing, and maintaining data pipelines and infrastructure across GCP and Oracle-based systems, ensuring high-performance data flows that support trading, risk, and compliance functions. This role requires a deep understanding of cloud architecture, large-scale data processing, and a strong grasp of agile delivery practices.
About the client
An international consulting firm that helps companies of all sizes have a better impact on the world. Its capabilities focus on supporting private- and public-sector clients with their people, process, and digital technology challenges.
What you will do
● Design, build, and maintain scalable data pipelines primarily within Google Cloud Platform
● Develop integration and transformation workflows between cloud data services and on-prem Oracle databases
● Work closely with trading, risk, and analytics teams to understand data requirements and deliver real-time and batch data solutions
● Optimise and monitor performance of data systems to support latency-sensitive trading applications
● Collaborate with cross-functional teams using Agile/Scrum methodologies to deliver business-critical data projects
● Ensure robust data governance, lineage, and compliance (including MiFID II, FCA, and other regulatory standards)
● Automate data workflows using Terraform, CI/CD pipelines, and containerisation tools (Docker/Kubernetes)
What you will bring
● Strong experience with Google Cloud Platform (GCP) services, e.g. BigQuery, Dataplex, Pub/Sub, Dataflow, and dbt
● Expertise in Oracle SQL, PL/SQL, and working with complex stored procedures and large datasets
● Proficiency in programming languages such as Python, Java, or Scala
● Experience with CI/CD tooling (e.g. Jenkins, GitLab) and container orchestration (Kubernetes)
● Experience working with Infrastructure as Code, preferably Terraform, to manage cloud data infrastructure
● Deep understanding of data modelling, data warehousing, and ETL/ELT design patterns
● Familiarity with Agile development practices (Scrum, Kanban, Jira)
● Exposure to financial markets, trading systems, or related high-performance environments is a strong plus
Nice to have
● GCP certification (e.g. Professional Data Engineer or similar)
● Experience with Amazon Web Services (AWS)
● Knowledge of regulatory reporting, market data, or trade surveillance systems
● Experience with Apache Airflow, dbt, or similar orchestration and transformation tools
● Understanding of data security practices and compliance frameworks