Data Engineer (Snowflake, dbt, Airflow) - Middle, Senior

Short overview: 

Remote, full-time commitment, hourly payment; work is mostly in the Kyiv time zone, though calls may extend into EST hours. 


About the Project

You will be joining a data-focused project centered around building and maintaining a modern data platform. The project involves designing scalable data pipelines, developing a robust data warehouse, and enabling reliable analytics through well-structured data models.

The work requires strong Python skills and includes hands-on development with Snowflake, dbt, and Apache Airflow. 
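For a concrete sense of how these tools fit together, here is a minimal, illustrative Airflow DAG that orchestrates dbt builds and tests. This is a sketch only: the DAG id, schedule, and project path are hypothetical placeholders, and dbt's Snowflake connection is assumed to be configured in profiles.yml outside the DAG.

```python
# Illustrative sketch: a minimal Airflow DAG orchestrating dbt against
# Snowflake. All names (DAG id, project path, schedule) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # hypothetical schedule
    catchup=False,
) as dag:
    # Build the dbt models (Snowflake credentials live in profiles.yml).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt's schema and data tests after the models are built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```

In practice the dbt tasks would likely be split per model layer and wired into CI/CD, but the dependency pattern stays the same.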

Requirements:
·    Experience in data engineering, software engineering, or a related role.
·    Strong proficiency in Python and SQL.
·    Experience building and operating production-grade data pipelines.
·    Proficiency in at least one additional language, such as Go or Java.
·    Deep hands-on experience with Apache Airflow.
·    Strong working knowledge of Snowflake.
·    Expert-level experience with dbt (Core & Cloud).
·    Strong experience with Kafka and streaming systems.
·    Experience designing and maintaining REST APIs.
·    Strong understanding of modern data architectures.
·    Experience with medallion architecture and dimensional modeling (a brief sketch follows this list).
·    Experience implementing CI/CD pipelines for data workflows.
·    Experience working in cloud environments, preferably AWS.
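As referenced above, a medallion layout typically moves data from raw (bronze) tables through cleaned (silver) tables into dimensional (gold) models that analysts query. Below is a minimal sketch of those layers on Snowflake using snowflake-connector-python; every schema, table, and column name is hypothetical, and credentials would come from a secrets manager in a real pipeline.

```python
# Minimal medallion-layer sketch on Snowflake. All schema, table, and
# column names are hypothetical placeholders.
import snowflake.connector

MEDALLION_STEPS = [
    # Silver: deduplicate and strongly type the raw bronze events.
    """
    CREATE OR REPLACE TABLE silver.events AS
    SELECT DISTINCT
        raw:event_id::STRING             AS event_id,
        TO_TIMESTAMP_NTZ(raw:ts::STRING) AS event_ts,
        raw:user_id::STRING              AS user_id
    FROM bronze.raw_events
    """,
    # Gold: a daily fact table, the dimensional layer analysts query.
    """
    CREATE OR REPLACE TABLE gold.fct_daily_events AS
    SELECT
        DATE_TRUNC('day', event_ts) AS event_date,
        user_id,
        COUNT(*)                    AS event_count
    FROM silver.events
    GROUP BY 1, 2
    """,
]

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",           # hypothetical service user
    password="***",            # placeholder; use a secrets manager
    warehouse="TRANSFORM_WH",  # hypothetical warehouse
    database="ANALYTICS",      # hypothetical database holding the schemas
)
cur = conn.cursor()
try:
    for stmt in MEDALLION_STEPS:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

In a dbt-centric setup like this project's, the same layers would be expressed as dbt models rather than hand-run SQL, with dbt managing the dependency graph between them.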

Nice to Have:
·    Familiarity with Docker and Kubernetes.
·    Experience with ClickHouse or other OLAP databases.
·    Experience with Airbyte or similar data integration tools.
·    Familiarity with data catalogs, lineage, or metadata management tools.
·    Experience enabling self-service analytics.
 

Required languages

English B2 - Upper Intermediate
Python, SQL, Snowflake, Apache Airflow, dbt, REST APIs, AWS, Java, Go
Published 2 February