Data Architect
Project tech stack: Python, Snowflake, Azure, Azure Data Factory, AWS
ABOUT THE ROLE
This is an exciting opportunity to work on a high-impact project, architecting an end-to-end data solution in a collaborative and forward-thinking environment. The ideal candidate will be at the forefront of delivering scalable, efficient, best-in-class data engineering solutions that support business-critical insights and reporting capabilities.
We are seeking a Lead/Architect Data Engineer to lead the design and implementation of robust data pipelines and warehouse solutions leveraging Snowflake, Azure, and Azure Data Factory. This role will focus on ingesting and transforming data from marketing and sales systems, enabling advanced analytics and reporting capabilities. The candidate will play a key advisory role in defining and implementing best practices for data ingestion, transformation, and reporting.
About the project
Our client is a global real estate services company specializing in the management and development of commercial properties. Over the past several years, the organization has made significant strides in systematizing and standardizing its reporting infrastructure and capabilities. Due to the increased demand for reporting, the organization is seeking a dedicated team to expand capacity and free up existing resources.
Location
Remote: Ukraine / Europe
Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or related field;
- 7+ years of experience in data engineering/architecture roles;
- Proficiency in database management and SQL; hands-on knowledge of modern data warehousing tools such as Snowflake, Databricks, or Redshift;
- Data modeling and design;
- Programming skills (Spark, Python);
- Experience with data governance frameworks and compliance requirements;
- Knowledge of big data technologies, machine learning integration or API development;
- Proficiency with cloud platforms (at least one of AWS, Azure, or GCP) for scalable solutions;
- Expertise in streaming pipeline design and complex data transformation;
- Ability to analyze system requirements and translate them into effective technical designs;
- Experience with performance optimization for large-scale databases;
- Understanding of CI/CD practices;
- Problem-solving mindset to address technical challenges in dynamic environments;
- Collaboration skills to work effectively with cross-functional teams;
- Expertise in using and/or introducing AI-based coding practices to projects.
Responsibilities
- Design and maintain the organization’s data architecture, including databases, data warehouses, and data lakes;
- Develop and implement data models to structure and organize data for storage and access;
- Design data pipeline architectures to handle real-time data processing;
- Define and implement Change Data Capture (CDC) pipelines;
- Ensure data security, integrity, and compliance, and assist in implementing data governance practices;
- Monitor system health, identify bottlenecks, and recommend improvements to ensure scalability and efficiency;
- Collaborate with cross-functional teams on data-related topics;
- Stay updated on emerging technologies to continuously improve the organization's data infrastructure.