Data Engineer (Python + Azure + Databricks)

Job Type: Full-time, Long-term

Location: EU / LATAM

English: Upper-Intermediate

Start: ASAP

 

 

About the Project

We are building an end-to-end Data Product in Databricks that enables analysis and visualization of energy market price data and links it directly to the client’s internal asset portfolio.

 

The platform will provide:

  • Analysis of price data at specific timestamps.
  • Connection between market prices and internal asset performance.
  • Insights into revenue and profit per asset (illustrated in the sketch after this list).
  • Dashboards and reports for decision-making.
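
For illustration only, a minimal PySpark sketch of the price-to-asset link described above. The table and column names (market_prices, asset_production, price_area, price_eur_mwh, production_mwh) are hypothetical placeholders, not the client’s actual schema:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

prices = spark.read.table("market_prices")         # hourly market prices per bidding zone (assumed table)
production = spark.read.table("asset_production")  # metered production per asset per hour (assumed table)

# Join production to the market price valid for the same hour and zone,
# then derive revenue per asset and timestamp.
revenue = (
    production.join(prices, on=["timestamp", "price_area"], how="inner")
              .withColumn("revenue_eur", F.col("production_mwh") * F.col("price_eur_mwh"))
)

# Aggregate to revenue per asset for dashboards and reports.
revenue_per_asset = (
    revenue.groupBy("asset_id")
           .agg(F.sum("revenue_eur").alias("total_revenue_eur"))
)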

 

Responsibilities

  • Design and deliver a modern data infrastructure in Azure + Databricks.
  • Build scalable, reliable, and reusable data pipelines.
  • Integrate price data with internal asset data.
  • Enable visualization of results via dashboards/reports.
  • Ensure data harmonization, quality, integration, and security (see the sketch after this list).
  • Collaborate in an agile, cross-functional team to support data strategy initiatives.
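
As one hedged example of the harmonization and quality work mentioned above, a bronze-to-silver cleansing step in Databricks might look roughly like the following; the bronze/silver table names and columns are assumptions for illustration, not a prescribed design:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw (bronze) price data as ingested from the market data source (assumed table name).
raw = spark.read.table("bronze.market_prices_raw")

# Harmonize types and drop obvious quality issues before publishing to the curated (silver) layer.
clean = (
    raw.withColumn("timestamp", F.to_timestamp("timestamp"))
       .dropDuplicates(["timestamp", "price_area"])
       .filter(F.col("price_eur_mwh").isNotNull())
)

# Publish the curated table for downstream joins and reporting.
clean.write.format("delta").mode("overwrite").saveAsTable("silver.market_prices")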

 

Must-Have Skills

  • Strong expertise in Python.
  • Proven hands-on experience with Azure (Data Engineering stack).
  • Databricks (end-to-end pipeline development & optimization).
  • Scalable and secure data architecture design.
  • Experience in data pipeline development (ETL/ELT).
  • Knowledge of data warehousing, data modeling, and relational databases.
  • Strong analytical and problem-solving mindset.

 

Nice-to-Have

  • Experience in the energy sector or related environments.

 

Role Purpose

This role is critical in building and maintaining a modern data infrastructure that supports Renewables’ mission to lead the energy transition. You will be responsible for designing and delivering the Price Data Product, connecting market prices with asset performance across the entire fleet.

Required languages

English B2 - Upper Intermediate