Data Engineer with Snowflake
Ukraine Β· 3 years of experience Β· Upper-IntermediateClient Our client is a global financial company offering independent asset allocation advice and access to top asset managers. They provide customized investment solutions for diverse clients, emphasizing dynamic asset allocation and fostering an...Client
Our client is a global financial company offering independent asset allocation advice and access to top asset managers. They provide customized investment solutions for diverse clients, emphasizing dynamic asset allocation and fostering an inclusive, collaborative culture.
Project overview
Design and development of analytical data solutions that support core business operations. Key components of the tech stack include Snowflake, Azure Data Factory, Python, and MySQL.
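To give a flavor of the work, below is a minimal sketch of one pipeline step in this stack, assuming the snowflake-connector-python package. The warehouse, database, stage, and table names (ANALYTICS_WH, FINANCE, RAW.TRADES, etc.) are hypothetical placeholders, and in a real setup Azure Data Factory would typically orchestrate steps like this.

```python
# Hypothetical illustration only: credentials come from environment variables,
# and all object names are placeholders, not the client's actual schema.
import os

import snowflake.connector  # pip install snowflake-connector-python


def load_daily_trades() -> None:
    """Copy staged files into a raw table, then refresh a simple reporting table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="FINANCE",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Ingest files that an orchestrator (e.g. Azure Data Factory) dropped on the stage.
        cur.execute(
            """
            COPY INTO RAW.TRADES
            FROM @RAW.TRADES_STAGE
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            """
        )
        # A simple downstream transformation step.
        cur.execute(
            """
            CREATE OR REPLACE TABLE ANALYTICS.DAILY_TRADE_SUMMARY AS
            SELECT trade_date, portfolio_id, SUM(amount) AS total_amount
            FROM RAW.TRADES
            GROUP BY trade_date, portfolio_id
            """
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_daily_trades()
```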
Responsibilities
- Design, develop, and maintain scalable data pipelines and workflows within Snowflake.
- Optimize data models and queries to ensure high performance and efficiency.
- Collaborate with data analysts, data scientists, and other stakeholders to understand data needs.
- Implement and monitor data ingestion, transformation, and integration processes (illustrated in the sketch after this list).
- Ensure data quality, consistency, and security across data systems.
- Automate data workflows and build reusable components to improve development efficiency.
- Troubleshoot and resolve data pipeline issues and performance bottlenecks.
- Participate in code reviews and agile ceremonies, and contribute to continuous improvement initiatives.
- Stay up to date with the latest Snowflake features and data engineering best practices.
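As a rough illustration of the transformation and data-quality responsibilities above, here is a hedged sketch of an incremental upsert followed by a basic quality check. The connection setup mirrors the earlier sketch, and every table and column name is a hypothetical placeholder.

```python
# Hypothetical illustration: table and column names are placeholders, not the client's schema.
import os

import snowflake.connector

MERGE_SQL = """
MERGE INTO ANALYTICS.CLIENT_POSITIONS AS tgt
USING RAW.CLIENT_POSITIONS_STAGING AS src
    ON tgt.position_id = src.position_id
WHEN MATCHED THEN UPDATE SET
    tgt.market_value = src.market_value,
    tgt.updated_at   = src.updated_at
WHEN NOT MATCHED THEN INSERT (position_id, market_value, updated_at)
    VALUES (src.position_id, src.market_value, src.updated_at)
"""

QUALITY_CHECK_SQL = """
SELECT COUNT(*) FROM ANALYTICS.CLIENT_POSITIONS WHERE market_value IS NULL
"""


def run_incremental_merge(conn) -> None:
    """Apply an incremental upsert, then fail loudly if a basic quality check breaks."""
    cur = conn.cursor()
    cur.execute(MERGE_SQL)          # transformation / integration step
    cur.execute(QUALITY_CHECK_SQL)  # simple data-quality gate
    null_count = cur.fetchone()[0]
    if null_count:
        raise ValueError(f"{null_count} rows have NULL market_value after merge")


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="FINANCE",
    )
    try:
        run_incremental_merge(conn)
    finally:
        conn.close()
```

In practice the quality gate would be broader (row counts, freshness, referential checks), but the pattern of merge-then-verify is the core of keeping downstream data consistent.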
Requirements
- Proven experience designing and optimizing data pipelines and models in Snowflake.
- Strong proficiency in SQL for complex query writing and performance tuning in cloud data warehouses.
- Solid understanding of ETL/ELT processes and data warehousing principles.
- Experience with cloud platforms such as AWS, Azure, or GCP and their data services.
- Programming skills in Python for data engineering tasks.
- Knowledge of data quality assurance, monitoring, and security best practices.
- Experience with version control systems such as Git.
- Ability to troubleshoot and optimize data pipelines and collaborate within agile teams.
Nice to have
- Experience with BI tools, streaming technologies, containerization, and CI/CD pipelines.