Senior GCP Data Engineer
We're looking for a Senior Data Engineer with GCP, BigQuery, and DBT experience to join a stable, long-term project for the biggest insurance tech company in the USA. We're building a completely new, advanced data platform based on a lakehouse architecture, with bronze, silver, and gold layers plus a dedicated data model layer. You'll use modern technologies and collaborate with one of the best GCP data architects in the world.
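To give candidates a concrete feel for the medallion layers, here is a minimal sketch of a bronze-to-silver refresh in BigQuery, driven from Python. The project id, dataset names, and columns (insurer-data-platform, bronze.policies_raw, silver.policies) are illustrative assumptions, not the client's actual schema:

```python
# Minimal sketch of a bronze -> silver refresh in BigQuery.
# All project/dataset/column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="insurer-data-platform")  # hypothetical project id

SILVER_SQL = """
CREATE OR REPLACE TABLE silver.policies AS
SELECT
  SAFE_CAST(policy_id AS INT64) AS policy_id,              -- coerce raw strings to typed columns
  LOWER(TRIM(customer_email)) AS customer_email,
  PARSE_DATE('%Y-%m-%d', effective_date) AS effective_date
FROM bronze.policies_raw
WHERE policy_id IS NOT NULL                                 -- drop rows failing a basic quality rule
"""

client.query(SILVER_SQL).result()  # submit the query job and wait for completion
```

The bronze layer keeps raw, untyped landings; the silver step applies typing and cleansing so gold-layer models can build on trusted data.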
About the Company
Our client is a large US product company and a global leader in insurance technology, seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP). Join us in scaling our Data and Analytics capabilities to drive data-informed decisions across the organization. You will design, build, and maintain efficient data pipelines, optimize data workflows, and integrate data seamlessly from diverse sources.
What You Will Do:
- Design, develop, and operationalize robust, scalable data pipelines.
- Develop BigQuery procedures, functions, and SQL objects.
- Optimize ETL processes for efficiency, scalability, and performance.
- Create production data pipelines using GCP services (BigQuery, Dataflow), DBT, Python, SQL, Apache Airflow, and integration tools such as Celigo (see the orchestration sketch after this list).
- Deploy streaming and batch jobs on GCP (Cloud Dataflow) in Java or Python.
- Build ETL frameworks with reusable components and automated quality checks.
- Develop and maintain scalable data models and schemas for analytics and reporting.
- Implement performance tuning, capacity planning, and proactive monitoring/alerting.
- Ensure rigorous data quality through testing and validation.
- Promptly troubleshoot and resolve data-related issues.
- Maintain thorough technical documentation.
- Stay current with industry trends to improve engineering practices.
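To illustrate the kind of orchestration referenced above, below is a hedged sketch of a daily Airflow DAG that runs a BigQuery transformation and gates it with an automated quality check. The DAG id, stored procedure, and table names are hypothetical, and the `schedule` argument assumes Airflow 2.4 or later:

```python
# Hedged sketch of a daily Airflow pipeline: one BigQuery transformation
# followed by an automated quality gate. All ids and names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCheckOperator,
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_policy_refresh",   # hypothetical DAG id
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    refresh_silver = BigQueryInsertJobOperator(
        task_id="refresh_silver_policies",
        configuration={
            "query": {
                "query": "CALL silver.sp_refresh_policies()",  # hypothetical stored procedure
                "useLegacySql": False,
            }
        },
    )

    # Fail the run if the refreshed table came out empty.
    check_not_empty = BigQueryCheckOperator(
        task_id="check_silver_not_empty",
        sql="SELECT COUNT(*) FROM silver.policies",
        use_legacy_sql=False,
    )

    refresh_silver >> check_not_empty
```

Chaining the check after the refresh means a bad load fails the run before downstream consumers ever see it.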
What You Need to Succeed:
- 5+ years in Data/ETL Engineering and Architecture, with at least 2 years on GCP.
- Proven expertise in building cloud-based data warehouses (preferably BigQuery).
- Hands-on experience with GCP services (Dataproc, Dataflow, BigQuery) and with DBT.
- Proficiency in SQL, Python, Apache Airflow, Cloud Composer, and ETL tools such as Talend or Fivetran.
- Experience using Git for version control and DBT for data transformation.
- Knowledge of Data Lake/Warehouse concepts and data modeling techniques: star schema, snowflake schema, and normalization (see the schema sketch after this list).
- Strong analytical and problem-solving skills.
- Excellent communication skills; ability to explain technical concepts clearly.
- Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.
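For context on the data modeling expectations, here is a minimal, hypothetical star schema in BigQuery DDL, executed from Python for consistency with the sketches above. One fact table references two dimensions; every table and column name is made up for illustration:

```python
# Hypothetical star-schema DDL for an insurance mart: one fact table
# referencing two dimension tables. All names are illustrative only.
from google.cloud import bigquery

client = bigquery.Client(project="insurer-data-platform")  # hypothetical project id

DDL = """
CREATE TABLE IF NOT EXISTS gold.dim_customer (
  customer_key INT64 NOT NULL,   -- surrogate key
  customer_name STRING,
  state STRING
);

CREATE TABLE IF NOT EXISTS gold.dim_date (
  date_key INT64 NOT NULL,       -- e.g. 20240101
  full_date DATE
);

CREATE TABLE IF NOT EXISTS gold.fact_claims (
  claim_id INT64 NOT NULL,
  customer_key INT64 NOT NULL,   -- FK to dim_customer
  date_key INT64 NOT NULL,       -- FK to dim_date
  claim_amount NUMERIC
);
"""

# BigQuery scripting accepts multiple semicolon-separated statements in one job.
client.query(DDL).result()
```

Keeping facts narrow (keys plus measures) and pushing descriptive attributes into dimensions is what makes the star shape efficient for analytics and reporting.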