We are establishing a brand-new Data Analytics team for a global retail customer.

In essence, the client is leveraging the Azure PaaS platform and all kinds of business data to build an advanced analytics platform aimed at delivering better insights and applications to the business.

The platform is continuously being enhanced to support additional CI/CD and validated-learning environments for data science, machine learning and AI capabilities across all areas: customer-facing ones such as digital omni-channel interaction and commerce, commerce relevance, personalisation, loyalty and marketing, and non-customer-facing ones such as assortment optimization, supply chain optimization, external parties and IoT.

We will be working on end-to-end functionality, including architecture, data preparation, processing and consumption by downstream systems.


Responsibilities:

As a Data Engineer, you will work alongside data architects to take data through its lifecycle: acquisition, exploration, cleaning, integration, analysis, interpretation and visualization. You will create pipelines for data processing, data visualization and analytics products, including automated services and APIs.

You will be the go-to person for end-to-end data handling, management and analytics processes.

You will:
• Ingest data sources into our data management platforms.
• Structure data into a scalable and easily understood architecture.
• Work in a multi-disciplinary team where you'll turn data discoveries and ideas into models and insights. You'll find ways to leverage the data and the models to create and improve products for our customers, in lean development cycles.
• Implement and build methodologies, and understand how to scale them together with the business.
• Maintain good, current and demonstrable knowledge of adjacent applications and market developments, both for inspiration and for benchmarking concepts.

Mandatory Skills Description:

Essential Experience Required
- Python/PySpark
- Azure Databricks (not mandatory if the candidate knows PySpark)
- SQL
- 5+ years of industry experience in large-scale data management, visualization and analytics

Other qualifications
- Basic knowledge of Azure Data Factory
- MSc in a computational field or another relevant area
- Hands-on experience, including solid programming skills, implementing pipelines that integrate database management systems, clean data and improve data quality
- Expertise in advanced data modelling
- Experience with Microsoft data management tools and the Azure platform environment
- Curious, proactive, fast learner, able to quickly pick up new areas
- Experience with agile methodologies
- Excellent communication skills
- Hands-on attitude
- Can-do approach

Nice-to-Have Skills:
• Experience with cloud-based big data solutions using Hadoop/Spark;
• SSAS cube development;
• Enterprise BI reporting with Power BI;
• Azure DevOps and CI/CD.

About Luxoft

Luxoft is a high-end application outsourcing provider of choice and a trusted technology advisor to Global 2000 and medium-sized growth companies that apply compelling technologies to obtain leadership positions in their respective markets.
Today Luxoft attracts top talent and offers career growth and employment benefits. Our teams work on highly complex and innovative projects for leading companies around the globe.

Company website:
https://career.luxoft.com/locations/ukraine/

DOU company page:
https://jobs.dou.ua/companies/luxoft/

Job posted on 22 February 2021
