Senior Data Engineer

Role Overview
As a Senior Data Engineer at ClubReady, you’ll be responsible for designing, maintaining, and optimizing the data pipelines and replication processes that power our customers’ access to their ClubReady data. You’ll collaborate closely with BI developers, DevOps, and Product to ensure that the data infrastructure is efficient, scalable, and easy to support. You’ll also play a key role in monitoring and troubleshooting customer data syncs, optimizing schema designs, and implementing new data delivery features.


Key Responsibilities

  • Develop and maintain data pipelines that replicate and transform ClubReady data into customer-facing databases.
  • Monitor and troubleshoot daily data loads, ensuring reliability and data consistency.
  • Optimize database performance across SQL Server and PostgreSQL environments (indexes, partitioning, query tuning).
  • Collaborate with BI and Product teams to design and implement new data sets and schema enhancements for analytics use cases.
  • Automate operational processes such as monitoring, alerting, and health checks for data syncs.
  • Ensure data security and compliance in all data handling and transfer processes.
  • Contribute to data documentation, pipeline design standards, and best practices.


Qualifications

Required:

  • 3+ years of experience as a Data Engineer, ETL Developer, or Database Engineer.
  • Proficiency in SQL Server and PostgreSQL, including performance tuning and query optimization.
  • Experience building and maintaining ETL pipelines using tools such as Airflow, dbt, SSIS, or similar.
  • Strong understanding of data modeling, change data capture (CDC), and incremental load strategies.
  • Experience with Python or another scripting language for automation and data transformation.
  • Familiarity with cloud environments (AWS or Azure) and data warehouse design principles.


Preferred:

  • Experience with replication systems or data synchronization frameworks between operational and analytics databases.
  • Experience with the Go programming language.
  • Experience with Snowflake.
  • Experience supporting multi-tenant or customer-facing data platforms.
  • Exposure to Looker, Power BI, or Tableau.


Work Hours:

This is a remote position, but you must be available during our US operating hours (12pm–8pm Kyiv time).

Required Languages

English: B2 (Upper Intermediate)

Keywords: Python, SQL, PostgreSQL, Azure, AWS, T-SQL, Data Warehouse, Data Engineer