Senior Data Engineer

A US company is looking for a Senior Data Engineer. Interesting project, distributed team, full-time, official contract. Remote work during CET business hours.


Brief project description:
The product is a complex SaaS platform whose applications improve Safety, Sustainability, and Productivity (work planning, incident management, monitoring & analytics) in high-risk industries: oil & gas gathering and transportation, chemicals, construction, and the energy market.


Main Responsibilities:

  • Build and operate production data pipelines for the product, analytics, and AI (a minimal orchestration sketch follows this list).
  • Integrate internal/external sources with strong quality, reliability, and SLAs.
  • Maintain and improve the data warehouse (performance, cost).
  • Model curated datasets/semantic layers and implement transformations as code (dbt or similar).
  • Ensure observability and operational readiness (tests, freshness checks, runbooks).
  • Support embedded Power BI in production (monitoring, incidents, improvements).
  • Partner with stakeholders and AI/ML teams to keep data current and clarify requirements.

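For illustration only, here is a minimal sketch of the pipeline-plus-freshness-check work described above, assuming Airflow 2.4+ for orchestration (the posting names Airflow as one acceptable tool). Every name in it (incidents_daily, the placeholder callables) is hypothetical, not part of the actual project.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_incidents():
    # Placeholder: pull new rows from a source system into the warehouse.
    ...


def check_freshness():
    # Placeholder: raise if the target table lags behind its freshness SLA.
    ...


with DAG(
    dag_id="incidents_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    extract = PythonOperator(task_id="extract", python_callable=extract_incidents)
    freshness = PythonOperator(task_id="freshness_check", python_callable=check_freshness)
    extract >> freshness  # the freshness check runs only after extraction succeeds

In practice the freshness check might instead live in the warehouse layer (e.g., dbt source freshness tests) rather than in the DAG itself.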

Must-Have Requirements:

  • 5+ years in data engineering / data platform / analytics engineering roles.
  • Strong SQL and experience with relational databases and analytical warehouses/lakehouses.
  • Proficiency in Python (or equivalent) for data processing, automation, and APIs.
  • Experience with cloud platforms (AWS and/or Azure) and cloud-native data services.
  • Production experience with workflow orchestration (Airflow/dbt or similar).
  • Solid data modeling skills (dimensional modeling/star schema or equivalent; a toy example follows this list).
  • Strong software engineering habits: Git, CI/CD, automated testing, readable maintainable code.
  • Clear communication in English and comfort explaining tradeoffs to non-experts.
  • BI enablement: Power BI / Tableau / Looker.

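As a toy illustration of the star-schema requirement, the snippet below builds a one-fact, one-dimension model in an in-memory SQLite database and runs a typical aggregate query. All table and column names are invented for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_site (site_id INTEGER PRIMARY KEY, site_name TEXT, region TEXT);
    CREATE TABLE fact_incident (
        incident_id INTEGER PRIMARY KEY,
        site_id INTEGER REFERENCES dim_site(site_id),
        severity INTEGER,
        occurred_at TEXT
    );
    INSERT INTO dim_site VALUES (1, 'Plant A', 'North'), (2, 'Plant B', 'South');
    INSERT INTO fact_incident VALUES
        (10, 1, 3, '2024-05-01'), (11, 1, 1, '2024-05-02'), (12, 2, 2, '2024-05-03');
""")

# Typical analytical query: aggregate the fact table by a dimension attribute.
for region, incidents, avg_severity in conn.execute("""
    SELECT d.region, COUNT(*), AVG(f.severity)
    FROM fact_incident f
    JOIN dim_site d USING (site_id)
    GROUP BY d.region
"""):
    print(region, incidents, round(avg_severity, 2))

The design point: facts stay narrow and additive while descriptive attributes live in dimensions, which is what keeps BI tools such as Power BI fast over large volumes.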

Considered a BIG plus:

  • Experience with Snowflake / Databricks / BigQuery or similar analytical engines.
  • Infrastructure as code: Terraform, plus Docker.
  • Experience using AI coding assistants (e.g., GitHub Copilot, Cursor, Claude Code).
  • Familiarity with GenAI/LLM and machine-learning fundamentals.


Work conditions:

  • Distributed team, remote work.
  • Kanban or Scrum approach, 5-6 members per team.
  • Full-time (40 hours per week).
  • Official contract: salary, sick-leave days, holidays, vacations.


Hiring process:
Step 1 - preliminary interview (general questions) - 30 minutes
Step 2 - internal technical interview - 40-50 minutes
Step 3 - technical interview with the team leader and architect - 1 hour

Required skills experience:

  • Azure: 5 years
  • CI/CD: 5 years

Required languages:

  • English: B2 (Upper Intermediate)

Tags: Power BI, Tableau, Looker, Snowflake, Databricks, BigQuery, Terraform, Docker