Senior Data Engineer (ETL, ML Experience)

Location: Remote (Europe preferred)
Contract Type: B2B
Experience: 7+ years as a Data Engineer
English Level: C1 (Advanced)
Compensation: Gross (to be specified)
Holidays: 10 public holidays per year (vacation and sick days unpaid)

About the Role
We are seeking a Senior Data Engineer with strong experience in ETL pipeline design, data analytics, and exposure to machine learning workflows. You will play a key role in designing, developing, and maintaining scalable data solutions to support analytics, reporting, and ML-driven decision-making.

You will work closely with data scientists, analysts, and software engineers to ensure data integrity, performance, and accessibility across the organization.

Key Responsibilities

  • Design, build, and maintain ETL/ELT pipelines for large-scale data processing, including Elasticsearch-based workloads.
  • Develop, optimize, and manage data models, data warehouses, and data lakes.
  • Collaborate with cross-functional teams to define data architecture, governance, and best practices.
  • Implement and maintain CI/CD workflows using AWS CodePipeline.
  • Work with Python and .NET for automation, data integration, and application-level data handling.
  • Support data-driven decision-making through analytics and reporting.
  • Troubleshoot and optimize database performance and data processing pipelines.
  • Implement data quality and validation frameworks to ensure reliable data flow.

Required Skills & Experience

  • 7+ years of professional experience as a Data Engineer or similar role.
  • Strong expertise in ETL development and orchestration, including hands-on experience with Elasticsearch.
  • Python - Expert level (data processing, automation, APIs, ML pipeline integration).
  • ETL Tools / Frameworks - Expert level (custom and/or AWS-native).
  • Data Analytics & Reporting - Expert level (data modeling, KPI dashboards, insights generation).
  • DBA experience - Experienced (database design, tuning, and maintenance).
  • AWS CodePipeline - Experienced (CI/CD for data workflows).
  • .NET - Experienced (integration, backend data logic).
  • Experience with data warehousing solutions (e.g., Redshift, Snowflake, BigQuery) is a plus.
  • Familiarity with machine learning data pipelines (feature engineering, data prep, model serving) is a plus.

Nice to Have

  • Experience with Airflow, dbt, or other orchestration tools.
  • Familiarity with Terraform or AWS CloudFormation.
  • Exposure to ML Ops and productionizing ML models.
  • Knowledge of data governance, security, and compliance standards.

Required Languages

English C1 - Advanced