A team with solid experience in automation and manual testing, as well as product development and management. We tailor best practices to specific business needs in a straightforward, effective manner. Our team enjoys taking on challenging projects and is open to expanding our current domain stack.
Data Engineer (Snowflake, dbt, Airflow) - Middle, Senior
Full Remote · Countries of Europe or Ukraine · 3 years of experience · English - B2

Short overview:
Remote, full-time commitment, hourly pay, working mostly in the Kyiv time zone, though communication may stretch into EST for calls.
About the Project
You will be joining a data-focused project centered on building and maintaining a modern data platform. The project involves designing scalable data pipelines, developing a robust data warehouse, and enabling reliable analytics through well-structured data models.
The work requires strong Python skills and includes hands-on development with Snowflake, dbt, and Apache Airflow.
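To make the stack concrete, below is a minimal sketch of the kind of pipeline this role involves: an Airflow DAG that builds dbt models against Snowflake and then tests them. The DAG name, schedule, and project path are illustrative assumptions, not details from this posting.

# Minimal illustrative sketch: an Airflow DAG orchestrating dbt against
# Snowflake. The DAG id, schedule, and project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="analytics_dbt_refresh",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the dbt models; the dbt profile is assumed to point at Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Validate the refreshed models with dbt's built-in tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test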
Requirements:
· Experience in data engineering, software engineering, or a related role.
· Strong proficiency in Python and SQL.
· Experience building and operating production-grade data pipelines.
· Proficiency in at least one additional language, such as Go or Java.
· Deep hands-on experience with Apache Airflow.
· Strong working knowledge of Snowflake.
· Expert-level experience with dbt (Core & Cloud).
· Strong experience with Kafka and streaming systems.
· Experience designing and maintaining REST APIs.
· Strong understanding of modern data architectures.
· Experience with medallion architecture and dimensional modeling (a sketch follows this list).
· Experience implementing CI/CD pipelines for data workflows.
· Experience working in cloud environments, preferably AWS.
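As a rough illustration of the medallion item above, the sketch below moves data through bronze → silver → gold layers in Snowflake using the snowflake-connector-python package. All schema, table, and credential names are hypothetical.

# Rough sketch of a medallion-style refresh (bronze -> silver -> gold) in
# Snowflake via the Python connector. All names below are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # assumed env-based credentials
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
)

with conn.cursor() as cur:
    # Silver: clean and deduplicate the raw (bronze) events.
    cur.execute("""
        CREATE OR REPLACE TABLE silver.events AS
        SELECT DISTINCT event_id, user_id, event_type,
               event_ts::TIMESTAMP AS event_ts
        FROM bronze.raw_events
        WHERE event_id IS NOT NULL
    """)
    # Gold: a small fact table shaped for dimensional analytics.
    cur.execute("""
        CREATE OR REPLACE TABLE gold.fct_daily_events AS
        SELECT user_id,
               event_type,
               DATE_TRUNC('day', event_ts) AS event_date,
               COUNT(*) AS event_count
        FROM silver.events
        GROUP BY user_id, event_type, event_date
    """)
conn.close()

In practice these transformations would more likely live as dbt models than raw connector calls; the connector version just keeps the sketch self-contained.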
Nice to Have:
· Familiarity with Docker and Kubernetes.
· Experience with ClickHouse or other OLAP databases.
· Experience with Airbyte or similar integration tools.
· Familiarity with data catalogs, lineage, or metadata management tools.
· Experience enabling self-service analytics.