dillotech.us

Joined in 2023
28% answers
We're a young, reference-based outsourcing company in the USA with transparent terms for customers and subcontractors.

    Senior GCP Data Engineer – Python

    Full Remote · Europe except Ukraine · Product · 5 years of experience · Upper-Intermediate

    We're looking for a Senior Data Engineer – DevOps with strong GCP and Python experience to join a stable, long-term project for the biggest insurance tech company in the USA. We're building a completely new, advanced data platform based on the lakehouse architecture, with bronze/silver/gold layers and a data model layer. You'll work with advanced technologies alongside one of the best GCP data architects in the world.
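    For context on the bronze/silver/gold layering mentioned above, here is a minimal, illustrative sketch of promoting raw bronze records into a cleaned silver table with the google-cloud-bigquery Python client. The project, dataset, table, and column names are assumptions made up for the example, not the client's actual schema.

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names -- placeholders, not the client's real schema.
PROJECT = "my-insurance-project"
BRONZE = f"{PROJECT}.bronze.policy_events_raw"
SILVER = f"{PROJECT}.silver.policy_events"

client = bigquery.Client(project=PROJECT)

# Promote bronze -> silver: keep only well-formed rows and deduplicate on event_id.
promote_sql = f"""
CREATE OR REPLACE TABLE `{SILVER}` AS
SELECT
  event_id,
  policy_id,
  SAFE_CAST(premium AS NUMERIC) AS premium,
  TIMESTAMP(event_ts) AS event_ts
FROM `{BRONZE}`
WHERE event_id IS NOT NULL
QUALIFY ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_ts DESC) = 1
"""

client.query(promote_sql).result()  # blocks until the BigQuery job finishes
```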

     

    About Company

    Our client is a large US product company and a global leader in insurance technologies, seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP). Join us in scaling our Data and Analytics capabilities to drive data-informed decisions across the organization. You will design, build, and maintain efficient data pipelines, optimize data workflows, and integrate data seamlessly from diverse sources.

     

    What You Will Do:

    • Build and maintain CI/CD pipelines to enhance productivity, agility, and code quality.
    • Optimize data pipelines and workflows for performance and scalability.
    • Design efficient processes to minimize data refresh delays, leveraging reusable components and automated quality checks.
    • Develop robust, scalable data pipelines supporting business needs.
    • Code BigQuery procedures, functions, and SQL database objects.
    • Monitor application performance, troubleshoot issues, and implement effective monitoring and alerting.
    • Lead the design and build-out of production data pipelines using GCP services and tools (BigQuery, DBT, Apache Airflow, Celigo, Python); a minimal orchestration sketch follows this list.
    • Ensure data quality through rigorous testing and validation.
    • Maintain thorough technical documentation and stay current with industry trends.
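    As a rough illustration of the orchestration work above, the sketch below defines a minimal Airflow DAG (Airflow 2.4+) that runs one scheduled BigQuery transform via BigQueryInsertJobOperator. The dag_id, connection ID, SQL, and table names are hypothetical placeholders rather than the project's real configuration.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Placeholder names -- illustrative only, not the project's actual setup.
GCP_CONN_ID = "google_cloud_default"
GOLD_TABLE = "analytics.daily_premiums"

with DAG(
    dag_id="daily_premium_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # run every day at 06:00 UTC
    catchup=False,
    tags=["sketch"],
):
    # Single task: rebuild a gold-layer rollup table from the silver layer.
    BigQueryInsertJobOperator(
        task_id="rollup_daily_premiums",
        gcp_conn_id=GCP_CONN_ID,
        configuration={
            "query": {
                "query": f"""
                    CREATE OR REPLACE TABLE `{GOLD_TABLE}` AS
                    SELECT policy_id, DATE(event_ts) AS day, SUM(premium) AS total_premium
                    FROM `silver.policy_events`
                    GROUP BY policy_id, day
                """,
                "useLegacySql": False,
            }
        },
    )
```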
       

    What You Need to Succeed:

    • 8+ years in Data/ETL Engineering, Architecture, and pipeline development, with at least 2 years focused on GCP.
    • Proven experience building scalable cloud Data Warehouses (preferably BigQuery).
    • 3+ years of advanced SQL and strong Python or Java programming experience.
    • Extensive experience optimizing ETL/ELT pipelines, data modeling, and schema design.
    • Expertise with GCP services: Composer, Compute, GCS, Cloud Functions, BigQuery.
    • Proficiency in DevOps tools (Git, GitLab) and CI/CD pipeline integration with GCP.
    • Strong automation scripting skills, especially with GCP Composer.
    • Solid understanding of Data Lake/Warehouse concepts and data modeling techniques such as star schema, snowflake schema, and normalization (a brief schema sketch follows this list).
    • Excellent problem-solving skills; able to work independently and collaboratively.
    • Strong communication skills, capable of explaining technical concepts clearly.
    • Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.
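    As a quick, hypothetical illustration of the star-schema modeling called out above, the DDL below creates one fact table keyed to two dimension tables in BigQuery; every dataset, table, and column name is invented for the example.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical star schema: a premium fact table referencing two dimensions.
star_schema_ddl = """
CREATE TABLE IF NOT EXISTS `analytics.dim_policy` (
  policy_key INT64,
  policy_id STRING,
  product_line STRING
);

CREATE TABLE IF NOT EXISTS `analytics.dim_date` (
  date_key INT64,
  calendar_date DATE,
  fiscal_quarter STRING
);

CREATE TABLE IF NOT EXISTS `analytics.fact_premium` (
  policy_key INT64,  -- joins to dim_policy
  date_key INT64,    -- joins to dim_date
  premium NUMERIC
);
"""

# BigQuery scripting lets the three statements run as a single job.
client.query(star_schema_ddl).result()
```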

    Senior GCP Data Engineer – ETL

    Full Remote · Europe except Ukraine · Product · 5 years of experience · Upper-Intermediate

    We're looking for a Senior Data Engineer – ETL with strong GCP and Python or Java experience to join a stable, long-term project for the biggest insurance tech company in the USA. We're building a completely new, advanced data platform based on the lakehouse architecture, with bronze/silver/gold layers and a data model layer. You'll work with advanced technologies alongside one of the best GCP data architects in the world.

     

    About Company

    Our client is a large US product company and a global leader in insurance technologies, seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP). Join us in scaling our Data and Analytics capabilities to drive data-informed decisions across the organization. You will design, build, and maintain efficient data pipelines, optimize data workflows, and integrate data seamlessly from diverse sources.

     

    What You Will Do:

    • Build and maintain CI/CD pipelines to enhance productivity, agility, and code quality.
    • Optimize data pipelines and workflows for performance and scalability.
    • Design efficient processes to minimize data refresh delays, leveraging reusable components and automated quality checks.
    • Develop robust, scalable data pipelines supporting business needs (see the ingestion sketch after this list).
    • Code BigQuery procedures, functions, and SQL database objects.
    • Monitor application performance, troubleshoot issues, and implement effective monitoring and alerting.
    • Lead the design and build-out of production data pipelines using GCP services and tools (BigQuery, DBT, Apache Airflow, Celigo, Python).
    • Ensure data quality through rigorous testing and validation.
    • Maintain thorough technical documentation and stay current with industry trends.
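    As a hedged illustration of the ingestion side of this role, the sketch below loads CSV files from Cloud Storage into a bronze BigQuery table with the google-cloud-bigquery client; the bucket, dataset, and table names are made up for the example.

```python
from google.cloud import bigquery

# Hypothetical locations -- placeholders only.
SOURCE_URI = "gs://example-landing-bucket/claims/2024-01-01/*.csv"
BRONZE_TABLE = "my-insurance-project.bronze.claims_raw"

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and wait for it to complete.
load_job = client.load_table_from_uri(SOURCE_URI, BRONZE_TABLE, job_config=job_config)
load_job.result()

table = client.get_table(BRONZE_TABLE)
print(f"Loaded {table.num_rows} rows into {BRONZE_TABLE}")
```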
       

    What You Need to Succeed:

    • 8+ years in Data/ETL Engineering, Architecture, and pipeline development, with at least 2 years focused on GCP.
    • Proven experience building scalable cloud Data Warehouses (preferably BigQuery).
    • 3+ years of advanced SQL and strong Python or Java programming experience.
    • Extensive experience optimizing ETL/ELT pipelines, data modeling, and schema design.
    • Expertise with GCP services: Composer, Compute, GCS, Cloud Functions, BigQuery.
    • Proficiency in DevOps tools (Git, GitLab) and CI/CD pipeline integration with GCP.
    • Strong automation scripting skills, especially with GCP Composer.
    • Solid understanding of Data Lake/Warehouse concepts and data modeling techniques (star schema, snowflake schema, normalization).
    • Excellent problem-solving skills; able to work independently and collaboratively.
    • Strong communication skills, capable of explaining technical concepts clearly.
    • Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.

    Senior Data Engineer – Java

    Full Remote · Europe except Ukraine · Product · 5 years of experience · Upper-Intermediate

    We're looking for a Senior Data Engineer – DevOps with strong GCP and Java experience to join a stable, long-term project for the biggest insurance tech company in the USA. We're building a completely new, advanced data platform based on the lakehouse architecture, with bronze/silver/gold layers and a data model layer. You'll work with advanced technologies alongside one of the best GCP data architects in the world.

     

    About Company

    Our client is a large US product company and a global leader in insurance technologies, seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP). Join us in scaling our Data and Analytics capabilities to drive data-informed decisions across the organization. You will design, build, and maintain efficient data pipelines, optimize data workflows, and integrate data seamlessly from diverse sources.

     

    What You Will Do:

    • Build and maintain CI/CD pipelines to enhance productivity, agility, and code quality.
    • Optimize data pipelines and workflows for performance and scalability.
    • Design efficient processes to minimize data refresh delays, leveraging reusable components and automated quality checks.
    • Develop robust, scalable data pipelines supporting business needs.
    • Code BigQuery procedures, functions, and SQL database objects (a hedged example follows this list).
    • Monitor application performance, troubleshoot issues, and implement effective monitoring and alerting.
    • Lead the design and build-out of production data pipelines using GCP services and tools (BigQuery, DBT, Apache Airflow, Celigo, Python).
    • Ensure data quality through rigorous testing and validation.
    • Maintain thorough technical documentation and stay current with industry trends.
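    As a hedged example of the BigQuery procedure work above: although this role leans Java, the sketch below uses the BigQuery Python client for consistency with the other examples, since the DDL itself is plain BigQuery SQL. The procedure, dataset, and column names are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical procedure that refreshes a reporting table for a given date.
ddl = """
CREATE OR REPLACE PROCEDURE `analytics.refresh_daily_claims`(run_date DATE)
BEGIN
  DELETE FROM `analytics.daily_claims` WHERE claim_date = run_date;

  INSERT INTO `analytics.daily_claims` (claim_date, policy_id, claim_count, total_paid)
  SELECT DATE(claim_ts), policy_id, COUNT(*), SUM(amount_paid)
  FROM `silver.claims`
  WHERE DATE(claim_ts) = run_date
  GROUP BY 1, 2;
END;
"""

client.query(ddl).result()

# Re-run the refresh for a specific day.
client.query("CALL `analytics.refresh_daily_claims`(DATE '2024-01-01')").result()
```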
       

    What You Need to Succeed:

    • 8+ years in Data/ETL Engineering, Architecture, and pipeline development, with at least 2 years focused on GCP.
    • Proven experience building scalable cloud Data Warehouses (preferably BigQuery).
    • 3+ years of advanced SQL and strong Python or Java programming experience.
    • Extensive experience optimizing ETL/ELT pipelines, data modeling, and schema design.
    • Expertise with GCP services: Composer, Compute, GCS, Cloud Functions, BigQuery.
    • Proficiency in DevOps tools (Git, GitLab) and CI/CD pipeline integration with GCP.
    • Strong automation scripting skills, especially with GCP Composer.
    • Solid understanding of Data Lake/Warehouse concepts and data modeling techniques (star schema, snowflake schema, normalization).
    • Excellent problem-solving skills; able to work independently and collaboratively.
    • Strong communication skills, capable of explaining technical concepts clearly.
    • Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.

    Senior GCP Data Engineer

    Full Remote · Europe except Ukraine · Product · 5 years of experience · Upper-Intermediate

    We're looking for a Senior Data Engineer with GCP, BigQuery, and DBT experience to join a stable, long-term project for the biggest insurance tech company in the USA. We're building a completely new, advanced data platform based on the lakehouse architecture, with bronze/silver/gold layers and a data model layer. You'll work with advanced technologies alongside one of the best GCP data architects in the world.

     

    About Company

    Our client is a large US product company and a global leader in insurance technologies, seeking an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP). Join us in scaling our Data and Analytics capabilities to drive data-informed decisions across the organization. You will design, build, and maintain efficient data pipelines, optimize data workflows, and integrate data seamlessly from diverse sources.

     

    What You Will Do:

    • Design, develop, and operationalize robust, scalable data pipelines.
    • Develop BigQuery procedures, functions, and SQL objects.
    • Optimize ETL processes for efficiency, scalability, and performance.
    • Create production data pipelines using GCP services (BigQuery, Dataflow) together with DBT, Python, SQL, Apache Airflow, Celigo, and related tools.
    • Deploy streaming and batch jobs on GCP (Cloud Dataflow, Java/Python); see the pipeline sketch after this list.
    • Build ETL frameworks with reusable components and automated quality checks.
    • Develop and maintain scalable data models and schemas for analytics and reporting.
    • Implement performance tuning, capacity planning, and proactive monitoring/alerting.
    • Ensure rigorous data quality through testing and validation.
    • Promptly troubleshoot and resolve data-related issues.
    • Maintain thorough technical documentation.
    • Stay current with industry trends to improve engineering practices.
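    To give a flavor of the Cloud Dataflow work referenced in the list above, here is a minimal Apache Beam batch pipeline in Python (Beam also offers a Java SDK). The bucket, table, and schema are invented for illustration, and the same pipeline would target Dataflow by switching the runner option.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Parse one JSON line into the row shape expected by BigQuery."""
    record = json.loads(line)
    return {
        "policy_id": record["policy_id"],
        "premium": float(record.get("premium", 0)),
        "event_ts": record["event_ts"],
    }


def run() -> None:
    # DirectRunner by default; pass --runner=DataflowRunner (plus project, region,
    # and temp_location) to execute the same pipeline on Cloud Dataflow.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadJsonLines" >> beam.io.ReadFromText("gs://example-landing-bucket/events/*.json")
            | "ParseEvents" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-insurance-project:bronze.policy_events_raw",
                schema="policy_id:STRING,premium:FLOAT,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```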

     

    What You Need to Succeed:

    • 5+ years in Data/ETL Engineering and Architecture, with at least 2 years on GCP.
    • Proven expertise in building cloud-based data warehouses (preferably BigQuery).
    • Hands-on experience with GCP services: DataProc, Dataflow, BigQuery, DBT.
    • Proficiency in SQL, Python, Apache Airflow, Composer, and ETL tools (Talend, Fivetran).
    • Experience using Git and DBT for version control and data transformation.
    • Knowledge of Data Lake/Warehouse concepts and data modeling techniques (star schema, snowflake schema, normalization).
    • Strong analytical and problem-solving skills.
    • Excellent communication skills; ability to explain technical concepts clearly.
    • Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.