PulseRise Technologies

Databricks Developer

We are looking for three experienced Databricks Developers to support the development and enhancement of a modern Azure-based data platform. The role focuses on building reusable, scalable framework components and ensuring high-quality, standardized data pipeline development. You will work with Python/PySpark, Spark, and SQL to design robust ETL/ELT architectures with strong governance and performance optimization. The position also includes responsibility for CI/CD implementation using GitHub and GitHub Actions within Databricks environments. This opportunity is ideal for senior engineers who combine deep Spark expertise with structured engineering practices and cloud integration experience. You will collaborate within an international team and contribute to long-term platform scalability and maintainability.


Details

Location: Remote (first week onsite for onboarding in Linz, Vienna, or Budapest – candidate’s preference)

Work Model: Remote with initial onsite onboarding

Employment Type: Contract, Full-time

Start Date: 01 June 2026 (earlier by arrangement)

Duration: Until 31 December 2026 (extension possible)

Language Requirements: Business-fluent English (German is a plus)


Key Responsibilities

Develop and maintain reusable framework components for Databricks-based data pipelines using Python/PySpark and SQL

Design and build standardized ETL/ELT structures including logging, monitoring, and error handling mechanisms

Optimize Spark jobs for performance, scalability, and cost efficiency

Implement and maintain CI/CD pipelines using GitHub and GitHub Actions for Databricks workflows

Integrate Azure services, Unity Catalog, and streaming components into the platform architecture

Ensure high data quality, structured documentation, and maintainable deployment processes

Collaborate closely with architects and data platform teams to enhance overall platform capabilities
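To make the framework-component responsibility above concrete, here is a minimal, hypothetical sketch of a standardized pipeline step with built-in logging and error handling. It is written in plain Python (rather than PySpark) so it stays self-contained; the `pipeline_step` decorator name and the list-based sample transformation are illustrative assumptions, not part of the actual platform:

```python
import logging
from functools import wraps

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")


def pipeline_step(name):
    """Wrap a transformation with standardized logging and error handling."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            logger.info("step %s: started", name)
            try:
                result = func(*args, **kwargs)
                logger.info("step %s: finished", name)
                return result
            except Exception:
                # Log the failure with traceback, then re-raise so the
                # orchestrator (e.g. a Databricks job) can mark the run failed.
                logger.exception("step %s: failed", name)
                raise
        return wrapper
    return decorator


@pipeline_step("normalize")
def normalize(rows):
    # Placeholder transformation; in Databricks this would operate on a
    # Spark DataFrame instead of a Python list.
    return [r.strip().lower() for r in rows]
```

In a real Databricks framework the same pattern would wrap PySpark transformations and emit structured logs to the platform's monitoring stack, so every pipeline gets uniform observability without per-pipeline boilerplate.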


Requirements

Senior-level hands-on experience with Databricks

Strong Spark and Python/PySpark expertise

Excellent SQL skills

Practical experience with GitHub and CI/CD pipelines (GitHub Actions)

Experience working with Microsoft Azure

Knowledge of Data Vault modeling concepts

Experience with streaming technologies

Hands-on experience with Unity Catalog

Experience with Splunk or comparable monitoring tools

Business-fluent English


Nice to Have

German language skills

Experience working in enterprise-scale data environments

Experience with platform standardization and reusable framework design

Required Languages

English C1 - Advanced
Published 26 February