Senior Data Engineer

Project Description
The project focuses on the modernization, maintenance, and development of an eCommerce platform for a large US-based retail company, serving millions of omnichannel customers weekly.

Solutions are delivered by several Product Teams working on different domains: Customer, Loyalty, Search & Browse, Data Integration, and Cart.

Current key priorities:

  • Onboarding of new brands
  • Re-architecture
  • Database migrations
  • Migration of microservices to a unified cloud-native solution without business disruption

Responsibilities

  • Design data solutions for a large retail company.
  • Support the processing of big data volumes.
  • Integrate solutions into the current architecture.

Mandatory Skills

  • Microsoft Azure Data Factory / SSIS
  • Microsoft Azure Databricks
  • Microsoft Azure Synapse Analytics
  • PostgreSQL
  • PySpark

Mandatory Skills Description

  • 3+ years of hands-on experience with Azure Data Factory and Azure Synapse.
  • Strong expertise in designing and implementing data models (conceptual, logical, physical).
  • In-depth knowledge of Azure services (Data Lake Storage, Synapse Analytics, Data Factory, Databricks) and PySpark for scalable data solutions.
  • Proven experience in building ETL/ELT pipelines to load data into data lakes/warehouses.
  • Experience integrating data from disparate sources (databases, APIs, external providers).
  • Proficiency in data warehousing solutions (dimensional modeling, star schemas, Data Mesh, Data/Delta Lakehouse, Data Vault).
  • Strong SQL skills: complex queries, transformations, performance tuning.
  • Experience with metadata and governance in cloud data platforms.
  • Certification in Azure/Databricks (advantage).
  • Experience with cloud-based analytical databases.
  • Hands-on experience with Azure MI, PostgreSQL on Azure, Cosmos DB, Azure Analysis Services, and Informix.
  • Experience with Python and Python-based ETL tools.
  • Knowledge of Bash/Unix/Windows shell scripting (preferable).

Nice-to-Have Skills

  • Experience with Elasticsearch.
  • Familiarity with Docker/Kubernetes.
  • Skills in troubleshooting and performance tuning for data pipelines.
  • Strong collaboration and communication skills.

Languages

  • English: B2 (Upper Intermediate)
