Big Data Engineer

OVERVIEW
More simple. More local. More digital. This is the slogan of our customer, Coop, Denmark's largest retail chain. Established in 1896, Coop stays competitive by transforming its business to meet the needs and expectations of customers in the modern digital era.

Technologies: Databricks, Data Factory, Delta Lake, Data Lake, Spark, Python, SQL, Azure DW, Azure.

KEY RESPONSIBILITIES
Design data models and develop integration and data processing (ETL) pipelines to feed the data lake and a large-scale data warehouse
Design and implement pipelines to detect anomalies in data quality
Work with industry experts to develop and support analytical solutions

REQUIREMENTS
2+ years of relevant industry experience
Strong knowledge of at least one of Python, Scala, SQL
Experience designing and deploying production systems with reliable monitoring and logging practices
Understanding of best practices in data quality and quality engineering
Good written and spoken English

WOULD BE A PLUS
Experience with Spark Structured Streaming
Experience with data mining and anomaly detection techniques
Knowledge of retail domain
Experience with Power BI or other reporting tools

WE OFFER YOU
Stable projects and long-term employment
4 weeks of paid vacation
90 days of fully paid sick leave
Office in the city center surrounded by a park
No bureaucracy
Free accounting services
Free English courses

About InterLogic

InterLogic is a Danish-owned IT company in Ukraine that provides outsourced software development.

Company website:
https://www.interlogic.com.ua/
