Data Engineer - Remote (GMT+1 to GMT+5)

 

๐Ÿ“ Remote | ๐Ÿ• One interview. That's it.

 

WHO WE ARE

 

At SouthRivers Data, we're an engineering-led company - built by engineers, run by engineers. That means no explaining why tech debt matters, no translating "we need refactoring time" into business speak, and no decisions made by people who've never written a line of code. We get it, because we've been there.

 

We believe that a great engineer is defined by two things equally: technical mastery and the ability to work with people. One without the other isn't enough for us - and probably isn't enough for you either.

 

 

THE ROLE

 

We're looking for a Data Engineer to join our team and work on data-intensive client projects - think DWH modernization, data lakehouse architecture, and pipeline engineering.
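
To give a concrete flavor of the pipeline side, here is a minimal orchestration sketch using Airflow's TaskFlow API (assumes Airflow 2.4+; the DAG, tasks, and data are invented placeholders, not a real client project):

    # Minimal, hypothetical Airflow DAG sketch (TaskFlow API, Airflow 2.4+).
    # The DAG name, tasks, and data are invented placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_orders_pipeline():
        @task
        def extract() -> list[dict]:
            # A real task would pull from a source system here.
            return [{"order_id": 1, "amount": 42.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Trivial stand-in for real business logic.
            return [r for r in rows if r["amount"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # A real task would write to the warehouse or lakehouse.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))

    daily_orders_pipeline()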

 

You'll be engaged on a project basis, with clear scope and a team that actually understands your work.

 

 

WHAT WE EXPECT

 

Required:

  • Strong Python and SQL - you write clean, production-grade code
  • Apache Spark - you know how it works under the hood, not just the API (see the sketch after this list)
  • Workflow orchestration with Airflow, Dagster, or equivalent
  • Hands-on experience with at least one major cloud: AWS / Azure / GCP
  • Solid understanding of Data Engineering principles, Data Modelling, dimensional design, and best practices
  • Experience with DWH and Lakehouse architectures - you've designed or maintained one, not just heard the terms
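
By "under the hood" we mean things like knowing when a join shuffles data and how to avoid it. A minimal PySpark sketch of that kind of reasoning (the dataset and column names are invented):

    # Hypothetical PySpark sketch - dataset and column names are invented.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = SparkSession.builder.appName("broadcast-join-demo").getOrCreate()

    # A "large" fact table and a small dimension table.
    fact = spark.range(10_000_000).withColumnRenamed("id", "customer_id")
    dim = spark.createDataFrame(
        [(i, f"segment_{i % 3}") for i in range(100)],
        ["customer_id", "segment"],
    )

    # A plain join of two large inputs triggers a full shuffle (SortMergeJoin).
    # Broadcasting the small side ships `dim` to every executor instead, so the
    # fact table is joined in place without moving across the cluster.
    joined = fact.join(broadcast(dim), "customer_id")

    joined.explain()  # the physical plan should show BroadcastHashJoin
    spark.stop()

Note that Spark may broadcast the small side automatically when it falls under spark.sql.autoBroadcastJoinThreshold; making the hint explicit keeps the plan deterministic.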

 

Nice to Have:

  • dbt
  • CI/CD experience in a data context (GitHub Actions, etc.) - see the sketch after this list
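
For the CI/CD point, a sketch of what a data check in CI might look like - a pytest test that could run in a GitHub Actions job after a dbt build; an in-memory DuckDB database stands in for the warehouse, and the table is invented:

    # Hypothetical CI data check (pytest). In a real setup this might run in a
    # GitHub Actions job after `dbt build`, pointed at the warehouse; here an
    # in-memory DuckDB database stands in, and the table is invented.
    import duckdb

    def test_orders_have_no_negative_amounts():
        con = duckdb.connect()  # in-memory database
        con.execute(
            "CREATE TABLE orders AS "
            "SELECT * FROM (VALUES (1, 10.0), (2, 5.5)) t(order_id, amount)"
        )
        bad = con.execute(
            "SELECT COUNT(*) FROM orders WHERE amount < 0"
        ).fetchone()[0]
        assert bad == 0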

 

 

THE PROCESS

 

One 1-hour interview with us. Possibly one short call with the client. That's the entire process - we respect your time and we've structured things so we can make a decision fast.

 

No take-home assignments.

 

 

THE SETUP

 

  • Fully remote - work from wherever you are
  • Contract / B2B engagement
  • 2-4 week notice window before onboarding

 

 

If you've been burned by companies that talk about engineering culture but are run by spreadsheets - this is the alternative.

 

Apply or reach out directly. We read every message.

Required skills experience

  • SQL - 3 years
  • Python - 3 years
  • Apache Spark - 3 years

Required languages

  • English - B2 (Upper Intermediate)

Keywords: DWH, ETL, PostgreSQL, Data Warehouse, Data Lakehouse, Lakehouse, Apache Airflow
Published 10 April