Senior Data Engineer (Data Lake + Python + SQL)

2K-Group Top Employer

We are looking for a Data Engineering Specialist who is passionate about building systems that shape business outcomes. In this position, you will play a key part in designing, building, and optimizing the data platform.

You will drive and manage large-scale data migration and modernization initiatives, and work with cross-functional teams to deliver trusted, high-quality data that drives business intelligence, analytics, and innovation. 

As a Data Engineering Specialist, you will be at the forefront of building scalable pipelines, architecting data solutions, and ensuring the data platforms can support real-time insights and long-term growth. 

Responsibilities and Duties: 
• Take ownership of coding and solution design while collaborating with internal and external engineers on design, development, and deployment of data pipelines and solutions. 
• Design and implement end-to-end data migration strategies from legacy systems to modern cloud platforms. 
• Manage data architecture and data modernization initiatives, ensuring alignment with business goals, data governance policies, and performance requirements. 
• Develop and optimize ETL/ELT pipelines using modern tools and frameworks (e.g., Apache Spark, Databricks, Airflow, dbt). 
• Work with stakeholders to gather requirements and translate them into scalable and maintainable data solutions. 
• Ensure data quality, reliability, and integrity through robust testing and monitoring. 
• Drive best practices in DevOps for data, including CI/CD. 
• Mentor and provide technical guidance to data engineers.

Qualifications Required
• 7+ years of relevant experience in Data Engineering / Analytics domain with at least 3 years in Digital Analytics 
• 7+ years of experience in SQL 
• 4+ years in Python 
• Experience building a data lake on Azure using ADF, Synapse, and Databricks (Delta Lake) 
• Proven expertise with data integration and ETL tools such as ADF, Qlik Replicate, and Fivetran. 
• Strong understanding of Python scripting and use of libraries (NumPy, Pandas) 
• Extensive hands-on experience with Microsoft Fabric, including data warehousing, SQL optimization, and cluster management. 
• Strong knowledge of Big Data technologies such as Databricks; event-driven data processing using Function Apps/Lambda is preferred 
• Understanding of web services (SOAP, XML, UDDI, WSDL) 
• Strong knowledge of and experience with event-driven architecture using standard message queues (e.g., RabbitMQ, SQS, or Kafka) 
• Hands-on experience parsing data file formats such as JSON, XML, Avro, and Parquet. 
• Proven track record of using Apache Spark on Databricks. 
• Hands-on experience with Azure Data Factory (ADF), Logic Apps, and Runbooks. 
• Exposure to traditional BI Tools (Tableau, Power BI, Qlik, SSRS, etc.) 
• Experience building web-services is a plus. 

Full-time remote job

Required skills and experience

Data Engineering 7 years
SQL 7 years
Python 4 years
Data Lake 5 years

Required languages

English B2 - Upper Intermediate
Published 10 March