On behalf of Just Eat Takeaway, Ciklum is looking for a Senior Scala Engineer to join the Data Systems – Core platform sub-team. You’ll have the opportunity to work with big data technologies, building scalable and reliable solutions to support real-time analytics, advanced data science and critical business operations.
Just Eat Takeaway.com is a leading global online food delivery marketplace headquartered in Amsterdam and listed on the London Stock Exchange.
We've built our business on having the widest choice available on our platform – connecting millions of customers with over 155,000 restaurants across 24 countries – with over 100 different cuisines from local independents to globally famous restaurants, available to order via our app and website.
We provide the platform and tools to help independent restaurants move online and reach a significantly broader customer base – to generate increased orders and grow their businesses. We also provide the insights, advice, and support our growing community needs to satisfy customers and help raise standards across a vibrant takeaway sector.
We’re built to deliver behind the scenes too. To make Just Eat the great company it is, it takes a great team of people. This is why all of our colleagues are welcomed into a diverse and inclusive workplace where they feel they can belong. We're passionate about nurturing our people and offer a full programme of training and support to our employees – helping them to develop their careers in a way that suits them.
Data Systems – Core platform team
Our team’s mission is to build the base functionality that powers a leading data platform. Our portfolio includes a variety of highly critical tools including (but not limited to) Real-time Ingestion, GDPR, Metadata Catalogues, and Access Controls. Here is a snapshot of some of our portfolio:
Real-time Ingestion – A generic data ingestion pipeline currently running up to 10k events per second from 2.5k topics.
Metadata Catalogues – Infer and store context around data (about 1 PB in 500 datasets) and keep it secure and searchable.
Our stack is primarily Scala, using libraries like ZIO, Cats, Akka Streams and Apache Beam. Python is also used in many areas.
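To give a flavour of the stack, here is a toy Akka Streams pipeline in the style of a batched ingestion stage. This is purely illustrative – the object name, source, and batch size are invented for the example, not taken from Just Eat's actual ingestion code:

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

import scala.concurrent.Await
import scala.concurrent.duration._

// Hypothetical sketch of a micro-batched ingestion stage:
// events flow from a source, are batched, and counted at the sink.
object IngestionSketch extends App {
  implicit val system: ActorSystem = ActorSystem("ingestion-sketch")

  // Stand-in for a real topic: a finite stream of JSON-like events.
  val events = Source(1 to 1000).map(n => s"""{"eventId": $n}""")

  val totalIngested = events
    .grouped(100)                  // micro-batch writes for throughput
    .map(batch => batch.size)      // stand-in for a sink write returning a count
    .runWith(Sink.fold(0)(_ + _)) // total events processed

  println(Await.result(totalIngested, 10.seconds)) // prints 1000
  system.terminate()
}
```

In a production pipeline the finite `Source` would be replaced by a topic consumer and the counting stage by a write to a downstream store, but the shape of the flow is the same.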
How We Do It
Our team is built on the following tenets:
Innovate: We are always learning, growing, inquisitive and keen on new technologies and open source tooling. We love like-minded engineers with a passion to keep our code-base and infrastructure best in class.
Build for Scale: All our tools and components are built for scale and we use Kubernetes and other tools to help us scale automatically.
Cloud-based: We use serverless technologies where possible to simplify our estate; technologies like BigQuery, Pub/Sub, Dataflow and Cloud Functions allow us to move quickly. In addition, we run a Kubernetes cluster on GKE with many workloads, including instances of Apache Airflow.
DevOps culture: Everyone in the team contributes to infrastructure; we have a CI/CD pipeline and we define our infrastructure as code. Our stack includes Terraform, Jenkins and Helm. Teams monitor their applications using Prometheus, Grafana and Alertmanager.
Collaboration & Ownership: All code is owned by the team and we have multiple avenues for collaboration – rotation, pairing and technical showcases. We also encourage team members to own their own code and promote self-governance.
The Data Systems team’s role is to build a transformational data platform to democratise data across Just Eat. Our team is built on the following ideals:
Open Data: We ingest all data produced across Just Eat using batch and real-time pipelines and make it available to every employee in Just Eat. This data is then used to drive analytics, business intelligence, data science and critical business operations.
Self Service: We build tools, frameworks and processes to support self-service modelling and activation of data. Our goal is to empower our users to find, process and consume our data without barriers.
Single Truth: We build services that host all metadata about Just Eat’s data in a single store and promote governance, data culture and a Single Source of Truth.
Intelligent Personalisation: We build and maintain a machine learning platform that supports data scientists in developing and deploying ML models at production scale. This allows us to deliver insights, personalisation and predictions to our customers at scale.
You are confident in a functional programming language like Scala, both in and outside of the data domain.
You love writing well tested, readable and performant code, capable of processing large volumes of data.
You love working with cloud technologies and have experience working with AWS, Azure or Google Cloud. We use Google Cloud with a mix of services – Kubernetes, Dataflow, Pub/Sub etc.
You can contribute to architecture discussions and influence peers and stakeholders to make better decisions.
You have the inclination to collaborate and the ability to communicate technical ideas clearly.
You understand the entire product development lifecycle, from coding to deployment, monitoring and alerting. Our teams maintain all aspects of our product lifecycle, but we don’t expect everyone to be an expert in all of it.
You understand the fundamentals of computing and distributed systems.
About Ciklum International
Ciklum is a top-five global Digital Solutions Company for Fortune 500 and fast-growing organisations alike around the world.
Our 3,000+ developers, located in delivery centres across the globe, provide our clients with a range of services including outsourced software development, Enterprise App Development, Quality Assurance, Security, R&D, and Big Data & Analytics.
Job posted on 15 April 2021