Senior Data Engineer (Azure Databricks)

to $5500

CrunchCode is an international IT services company with roughly 7 years of experience building web services and web applications. We work in staff augmentation (outstaff) and outsourcing formats, placing specialists on client projects under a long-term cooperation model.

We work primarily on projects in logistics (including last mile), e-commerce, fintech, and banking, as well as on enterprise solutions.
It is important to us that a project is "clean" and transparent in terms of ethics and the value it delivers to users.

As a matter of principle, we do not take on projects related to:
● gambling,
● adult content and pornography,
● fraud or any development aimed at deception or manipulation.

What We Offer:
● Fully remote work
● Long-term, stable project
● High level of autonomy and trust
● Minimal bureaucracy
● Direct impact on business-critical logistics systems
● Long-term engagement, not a short-term contract

Project Overview
Operational management and ongoing enhancement of an Azure Databricks-based Cloud Data Platform (CDP). The role acts as the link between cloud infrastructure and business-facing data solutions — ensuring the platform stays secure, reliable, and production-ready while delivering new integrations, interfaces, and data products. Split: ~70% platform operations, ~30% enhancement and delivery.

Tech Stack: Azure Databricks · Terraform · Python · Bash · Azure APIM · Bitbucket · GitHub · CI/CD · Jira

Requirements (Must-have):
- Proven experience operating cloud-based data platforms in production environments
- Strong hands-on Azure Databricks expertise — workflows/jobs, compute management, permissions, troubleshooting
- Experience with batch and/or streaming data pipelines — production-grade reliability
- Solid monitoring and observability skills — dashboards, alerts, health checks
- Security best practices — access control, secrets management, least-privilege, secure connectivity
- Strong ownership mindset and cross-team communication skills
- Fluent English

Responsibilities:
- Manage daily CDP operations — stability, security, governance, reliability, and cost efficiency
- Monitor platform and pipeline performance, enhance alerting and Data Health Monitoring
- Support operational requests — user access, configuration management, troubleshooting, incident resolution
- Coordinate production deployments and release activities for platform and data product changes
- Maintain secure integration with the broader application ecosystem — identity, secrets, network connectivity
- Continuously improve operational processes through automation, runbooks, and root-cause analysis
- Design, build, test, and release data integrations across multiple data products
- Develop and extend interfaces for downstream consumers via APIs and scalable consumption patterns
- Implement and manage role-based security and authentication for Databricks applications and datasets
- Contribute to metadata management, data discovery, and operational data quality initiatives

Nice to Have:
- Broader Azure ecosystem — storage/data lakes, networking, identity services
- Delta Lake optimization and operational tuning
- ITSM-style service request and change management experience
- Data governance concepts — metadata, lineage, data quality reporting

Hiring Process: 2–3 interviews → Offer

Required languages

English B2 - Upper Intermediate
Ukrainian Native
Published 15 May