Senior Data Engineer (Python + GCP + BigQuery)

Up to $5800

CrunchCode is an international IT services company with about 7 years of experience building web services and web applications. We work in staff augmentation (outstaff) and outsourcing models, placing specialists on client projects for long-term collaboration.

We work primarily on projects in logistics (including last mile), e-commerce, fintech, and banking, as well as on enterprise solutions.
It is important to us that a project is "clean" and transparent in terms of ethics and value for its users.

As a matter of principle, we do not take on projects involving:
● gambling,
● adult content or pornography,
● fraud, or any development aimed at deception or manipulation.

What We Offer:
● Fully remote work
● Long-term, stable project
● High level of autonomy and trust
● Minimal bureaucracy
● Direct impact on business-critical logistics systems
● Long-term engagement, not a short-term contract

Tech Stack
Python, SQL, GCP, BigQuery, ETL/ELT, Data Modeling, dbt, Airflow / Cloud Composer
Duration: 9+ months

Requirements (Must-have):
- Strong hands-on Python experience for data engineering
- Strong experience with GCP and cloud-based data workflows
- Experience building and maintaining production data pipelines
- Strong SQL and data modeling skills
- Ability to work with large-scale or unstructured data
- Experience collaborating with data scientists and cross-functional engineering teams
- Ability to deliver practical, reliable solutions in a fast-evolving environment

Responsibilities:
- Build new data pipelines and adapt existing ETL/ELT workflows
- Move and organize data in cloud storage and analytical systems (BigQuery)
- Work closely with data scientists on tables, transformations, and methodology implementation
- Connect services and data sources required for the solution
- Optimize queries and support scalable data processing
- Work with large, unstructured datasets including voice-related data
- Help operationalize evolving architecture decisions into production-ready data flows
- Collaborate with engineering and data teams to ensure reliable execution

Nice to Have:
- Experience with dbt
- Experience with Airflow or Cloud Composer
- Experience with large, unstructured, or voice-related datasets
- Lead-level coordination experience
- Experience supporting production data workflows in rapidly evolving projects

Hiring Process:
- Intro call
- Technical discussion
- Offer
Start: ASAP

Required languages

English B2 - Upper Intermediate
Ukrainian Native
Published 14 April