Data Architect
Project tech stack: Snowflake, AWS, Python/dbt, DWH design & implementation of medallion architecture, strong integration experience, data modeling for analytical solutions, CI/CD
We are looking for a hands-on Data Architect to build and scale a Snowflake-based data platform supporting Credit Asset Management and Wealth Solutions. The role involves ingesting data from SaaS investment platforms via data shares and custom ETL, establishing a medallion architecture, and modeling data into appropriate data marts that expose it to analytical consumers.
Our client is a global real estate services company specializing in the management and development of commercial properties. Over the past several years, the organization has made significant strides in systematizing and standardizing its reporting infrastructure and capabilities. Due to the increased demand for reporting, the organization is seeking a dedicated team to expand capacity and free up existing resources.
Location
Remote: Europe (Poland, Romania, Spain, Portugal)
Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- 7+ years of experience in data engineering roles
- Database management and SQL proficiency
- Strong Snowflake experience, including proven delivery of scalable data pipelines into Snowflake and work with data shares and custom connectors
- Proficiency with AWS for scalable solutions; Azure is a plus
- Expertise in streaming pipeline design and complex data transformation: hands-on ETL/ELT experience (Workato strongly preferred) and proficiency in Python and/or dbt for transformations and testing
- Proven experience implementing medallion architecture and data quality frameworks
- Data modeling and design for analytical solutions
- Experience with data governance, data lifecycle management, cataloging, lineage, and access control design
- Experience setting up IAM/role-based access, cost optimization, and CI/CD for data pipelines
- Ability to analyze system requirements and translate them into effective technical designs
- Experience with performance optimization for large-scale databases
- Problem-solving mindset to address technical challenges in dynamic environments
- Collaboration skills to work effectively with cross-functional teams
- Experience using and/or introducing AI-assisted coding practices on projects
Nice to Have
- Domain exposure to credit/investments and insurance data
- Familiarity with schemas and data models from: BlackRock Aladdin, Clearwater, WSO, SSNC PLM
- Experience with Databricks, Airflow, or similar orchestration tools
- Prior vendor/staff augmentation experience in fast-moving environments
Responsibilities
- Design, build, and maintain scalable data pipelines into Snowflake using Workato and native Snowflake capabilities
- Design pipelines and complex data transformations; integrate heterogeneous vendor data via data shares and custom ETL
- Define, implement, and enforce medallion architecture (bronze/silver/gold) and data quality checks
- Collaborate with the tech lead and business partners to define logical data marts for analytics and reporting
- Set standards and best practices for CDC, data lineage, metadata management, and master data management (MDM)
- Define enterprise-level policies for data governance, security, privacy, and compliance, working closely with risk and security teams
- Contribute to non-functional setup: IAM/role-based access, data cataloging, lineage, access provisioning, monitoring, and cost optimization
- Operate effectively in a less-structured environment; proactively clarify priorities and drive outcomes
- Collaborate closely with team members and other stakeholders
- Provide technical leadership across teams, guiding engineers, analysts, and scientists in adopting architecture standards
- Document data pipelines, processes, and best practices
- Evaluate and recommend new data technologies
Required languages
English: B2 (Upper Intermediate)