Senior Data + Infrastructure Engineer (ClickHouse, Kafka, dbt)

Venon.io

We're a small, fast-moving team building a B2B SaaS product in the e-commerce analytics space. We're looking for a skilled Data & Infrastructure Engineer to help us scale our data pipelines, optimize our analytical engine, and maintain a rock-solid production environment.


About the role

You'll work closely with our technical team to ensure our data flows seamlessly from ingestion to insight. This isn't a role where you'll be handed a perfectly architected blueprint. We need someone who can take ownership of our infrastructure, make smart trade-offs between performance and cost, and communicate clearly when architecting complex systems.


Our stack is built to handle high-volume analytics at scale. We rely on ClickHouse for our heavy lifting, Kafka for real-time data streaming, and Kubernetes for orchestration. We use dbt to keep our transformations sane and manageable.


What you'll be doing

  • Scaling the Backbone: Managing and optimizing our ClickHouse clusters and Kafka pipelines to handle growing data volumes.
  • Infrastructure as Code: Maintaining and evolving our Kubernetes environment to ensure high availability and performance.
  • Data Modeling: Building and maintaining robust data models using dbt to support our analytics product.
  • Performance Tuning: Identifying bottlenecks in data ingestion and query performance, and implementing long-term fixes.
  • Reliability: Debugging production infrastructure issues when they arise, and contributing to technical discussions and code reviews.


What we're looking for

  • 3+ years of professional experience in Data Engineering, DevOps, or Site Reliability Engineering.
  • Expertise in ClickHouse: You know how to optimize MergeTree engines and write efficient analytical queries.
  • Streaming Mastery: Hands-on experience deploying and managing Kafka clusters.
  • Orchestration Skills: Strong experience working with Kubernetes (K8s) in a production environment.
  • SQL & dbt: A deep understanding of SQL and experience using dbt for data transformation workflows.
  • Independence: Ability to manage your own time and take infrastructure projects from concept to completion.
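To give a flavor of the ClickHouse work this role involves, here is a minimal, illustrative sketch of a MergeTree table tuned for analytical queries. The table, column names, and query are hypothetical examples, not part of Venon.io's actual schema:

```sql
-- Hypothetical example: a page-view events table tuned for analytics.
-- All names here are illustrative, not Venon.io's real schema.
CREATE TABLE events
(
    event_date  Date,
    shop_id     UInt64,
    event_type  LowCardinality(String),
    user_id     UInt64,
    payload     String
)
ENGINE = MergeTree
PARTITION BY toYYYYMM(event_date)           -- monthly partitions keep merges cheap
ORDER BY (shop_id, event_type, event_date); -- sort key matches common filters

-- Queries filtering on the leading sort-key columns can skip most
-- granules instead of scanning the whole table:
SELECT event_type, count() AS events
FROM events
WHERE shop_id = 42
  AND event_date >= today() - 30
GROUP BY event_type;
```

Choosing a sort key that mirrors the most common query filters is the core of the MergeTree tuning mentioned above; the same models would typically be materialized and tested through dbt.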


Bonus points

  • Hetzner Cloud: Experience managing bare metal or cloud instances specifically within the Hetzner ecosystem.
  • Infrastructure as Code: Experience with Terraform, Pulumi, or Ansible.
  • Industry Context: Background in e-commerce, Shopify, or marketing analytics.
  • Backend Knowledge: Familiarity with Node.js/TypeScript to help bridge the gap between infra and the app layer.


The Process

The interview process consists of a 30-minute technical interview and, in the second round, a 1-hour coding/system design task focused on data architecture.

Required skills & experience

  • ClickHouse — 2 years
  • Apache Kafka — 2 years

Required languages

  • English — B2 (Upper Intermediate)
Published 29 March