Lead Data Engineer for Just Eat (offline)

Just Eat Takeaway.com is a leading global online food delivery marketplace headquartered in Amsterdam and listed on the London Stock Exchange.

We've built our business on having the widest choice available on our platform – connecting millions of customers with over 155,000 restaurants across 24 countries – with over 100 different cuisines from local independents to globally famous restaurants, available to order via our app and website.

We provide the platform and tools to help independent restaurants move online and reach a significantly broader customer base – to generate increased orders and grow their businesses. We also provide the insights, advice, and support our growing community needs to satisfy customers and help raise standards across a vibrant takeaway sector.

We’re built to deliver behind the scenes too. To make Just Eat the great company it is, it takes a great team of people. This is why all of our colleagues are welcomed into a diverse and inclusive workplace where they feel they can belong. We're passionate about nurturing our people and offer a full programme of training and support to our employees – helping them to develop their careers in a way that suits them.

As our Lead Data Engineer, you’re part of the diverse Data Engineering team based at our Amsterdam HQ. The team’s goal is to support our existing platform, in addition to creating new components and integrations and ensuring production runs smoothly.

Day to day, the team verifies data integrity and transforms data into the best format for each stakeholder, using the right tools to support their operational work.

Responsibilities:
Collect and transform unstructured data from different sources into a structured output – e.g. columnar databases, flat files, Parquet/ORC files, NoSQL stores or streams
Use your skills and best practices to create reliable, frequently/continuously running pipelines
Create reusable, maintainable and scalable integrations & services, using a cutting-edge cloud infrastructure
Model and test data, implement proper logging and troubleshoot any issues swiftly
Collaborate effectively with team members and other stakeholders in Agile iterative processes
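The first responsibility above – turning unstructured or semi-structured input into a structured, tabular output – can be sketched in miniature. This is a minimal, hypothetical illustration (the event fields and function names are invented, not from the job ad): it flattens newline-delimited JSON order events into a flat CSV using only the Python standard library.

```python
# Hypothetical sketch: flattening semi-structured JSON order events into a
# structured, tabular (CSV) output. Field names are invented for illustration.
import csv
import io
import json

RAW_EVENTS = """\
{"order_id": 101, "restaurant": {"name": "Pizza Roma", "city": "Amsterdam"}, "total": 23.5}
{"order_id": 102, "restaurant": {"name": "Sushi Go", "city": "London"}, "total": 41.0}
"""

def flatten(event: dict) -> dict:
    """Flatten one nested order event into a flat row."""
    return {
        "order_id": event["order_id"],
        "restaurant_name": event["restaurant"]["name"],
        "restaurant_city": event["restaurant"]["city"],
        "total": event["total"],
    }

def events_to_csv(raw: str) -> str:
    """Parse newline-delimited JSON and serialise the rows as CSV."""
    rows = [flatten(json.loads(line)) for line in raw.splitlines() if line.strip()]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(events_to_csv(RAW_EVENTS))
```

In a real pipeline the same flatten-and-serialise step would typically write Parquet/ORC via a dedicated library and be scheduled as a recurring task, but the shape of the transformation is the same.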

Requirements:
5+ years’ experience in handling data pipelines, data warehouses or other (preferably distributed) data stores
In-depth knowledge of Python, with 3+ years’ hands-on experience
Proficient in working with Airflow
Experience with MPP data warehouses such as Redshift, Teradata or Snowflake, and with relational databases such as Postgres, Oracle or MySQL
Experience in the cloud is preferred (AWS, GCP, Azure)
Skilled in parsing structured and unstructured data. Knowledge of data warehousing is a plus
An independent way of working: you prefer deploying services to the cloud yourself over waiting for a DevOps engineer to hand you servers
Passionate about clean code and clear ideas, and able to strike a balance between development speed, documentation and testing
Fluent English (written & spoken) and good communication skills

About Ciklum International

Ciklum (www.ciklum.com) is a leading global product engineering and digital services company, serving Fortune 500 and fast-growing organisations.

Headquartered in the UK, Ciklum has 4,000+ software developers, designers, product managers and data scientists around the world building tailored digital solutions that leverage emerging technologies. Ciklum specialises in enabling digital transformation for some of the largest household names in the digital economy.

The Company empowers its clients and people to exceed their potential and pursue the extraordinary.

Join one of the top 10 employers in Ukraine, according to Forbes.
Boost your skills and make a difference with cutting-edge projects, skilled colleagues and the latest tech stacks.

Company website:
https://www.ciklum.com/

DOU company page:
https://jobs.dou.ua/companies/ciklum/

The job ad is no longer active
Job unpublished on 15 July 2021
