Data Engineer
Key Responsibilities
- Architect, build, and maintain high-performance data pipelines and warehouses in Snowflake.
- Design and optimize dimensional data models (star & snowflake schemas) using Kimball methodology.
- Implement automated data quality checks, monitoring, and alerting (an illustrative sketch follows this list).
- Partner with analytics, product, and engineering teams to translate requirements into technical solutions.
- Mentor junior engineers and champion data engineering best practices.
- Develop and optimize ETL/ELT workflows integrating multiple cloud and SaaS data sources.
- Continuously tune Snowflake performance and manage cloud resources efficiently.
- Document data architecture, transformations, and operational processes.
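
The following is a minimal sketch of the kind of automated data quality check described above, assuming the snowflake-connector-python package and placeholder connection details; the table name, check SQL, and thresholds are illustrative only and not part of this posting.

```python
# Illustrative data quality checks against Snowflake.
# Assumes snowflake-connector-python; all credentials, warehouse, and table
# names below are placeholders, not details from this job posting.
import os
import snowflake.connector

CHECKS = {
    # Each query returns a single integer; 0 means the check passed.
    "orders_null_keys": (
        "SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS WHERE ORDER_ID IS NULL"
    ),
    "orders_stale_data": """
        SELECT CASE WHEN MAX(LOADED_AT) < DATEADD('hour', -24, CURRENT_TIMESTAMP())
                    THEN 1 ELSE 0 END
        FROM ANALYTICS.PUBLIC.ORDERS
    """,
}


def run_checks() -> dict:
    """Run each SQL check and collect failures keyed by check name."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],  # placeholder credentials
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORMING",  # illustrative warehouse name
    )
    failures = {}
    try:
        cur = conn.cursor()
        for name, sql in CHECKS.items():
            cur.execute(sql)
            (value,) = cur.fetchone()
            if value:  # non-zero result means the check failed
                failures[name] = value
    finally:
        conn.close()
    return failures


if __name__ == "__main__":
    failed = run_checks()
    if failed:
        # In practice this is where alerting (Slack, PagerDuty, email) would hook in.
        raise SystemExit(f"Data quality checks failed: {failed}")
    print("All data quality checks passed.")
```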
Required Qualifications
- 5+ years of experience in data engineering.
- 3+ years of hands-on Snowflake experience, including query and warehouse optimization.
- Strong skills in dimensional modeling and data warehouse design.
- Proficiency in SQL and at least one programming language (Python preferred).
- Experience with modern data pipeline tooling and orchestration (see the illustrative sketch after this list).
- Solid understanding of data warehouse concepts and best practices.
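
As one example of the orchestration tooling referred to above, here is a minimal Apache Airflow sketch (Airflow is an assumption, not a tool named by this posting); the DAG id, schedule, and task bodies are placeholders, and a recent Airflow 2.x release is assumed.

```python
# Illustrative orchestration sketch with Apache Airflow.
# The pipeline name, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw data from a source system (placeholder)."""
    print("extracting...")


def load():
    """Load the extracted data into a Snowflake staging area (placeholder)."""
    print("loading...")


with DAG(
    dag_id="example_elt_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```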
Preferred Skills
- Experience with dbt (data build tool).
- Familiarity with AWS, Azure, or GCP (AWS preferred).
- Experience with Git-based workflows and CI/CD pipelines.
- Knowledge of data governance and security in the cloud.
- Exposure to BI tools such as Looker, Tableau, or Power BI.
Required Languages
- English (Native)
Keywords: Python, SQL, Git, ETL, Data Engineer, AWS
Salary range: $4,000-$6,000