Data Engineer (GCP)

Ciklum is looking for an Expert Data Engineer to join our team full-time in Ukraine.

We are a custom product engineering company that helps both multinational organizations and scaling startups solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

 

Responsibilities:

  • Lead efforts to optimize BigQuery performance, identify and resolve inefficient queries, and implement best practices for data processing at scale
  • Proactively analyze SQL code for antipatterns and suggest refactoring to improve query execution, cost efficiency, and maintainability
  • Develop and implement strategies for BigQuery storage optimization, including data partitioning, clustering, and archiving (an illustrative SQL sketch follows this list)
  • Collaborate extensively with various data teams to understand their pipelines, identify optimization opportunities, and provide guidance on efficient BigQuery usage
  • Design, build, and maintain robust and cost-effective data solutions that leverage BigQuery capabilities to meet evolving business needs
  • Provide technical leadership and mentorship to data engineers, fostering a culture of continuous improvement in data pipeline efficiency and quality
  • Develop and maintain documentation for BigQuery optimization techniques, best practices, and standard operating procedures
  • Participate in code reviews, ensuring adherence to BigQuery optimization principles and high-quality SQL code standards
  • Research and evaluate new BigQuery features and related technologies to enhance data processing capabilities and efficiency
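
For illustration only, here is a minimal BigQuery Standard SQL sketch of the kind of optimization work described above: a partitioned and clustered table with automatic partition expiration, and a query refactored away from a common antipattern. All table and column names (analytics.events, event_ts, and so on) are hypothetical and not part of the role description.

    -- Illustrative sketch only; table and column names are hypothetical.
    -- Partitioning and clustering keep scanned bytes (and cost) proportional to the data actually needed.
    CREATE TABLE analytics.events
    (
      event_id   STRING,
      user_id    STRING,
      event_type STRING,
      event_ts   TIMESTAMP
    )
    PARTITION BY DATE(event_ts)                 -- prune partitions by event date
    CLUSTER BY user_id, event_type              -- co-locate rows for common filter columns
    OPTIONS (partition_expiration_days = 730);  -- expire (archive) old partitions automatically

    -- Antipattern: SELECT * with no partition filter scans the entire table.
    SELECT * FROM analytics.events WHERE event_type = 'purchase';

    -- Refactor: project only the needed columns and filter on the partition column,
    -- so BigQuery prunes partitions and bills far fewer bytes.
    SELECT event_id, user_id, event_ts
    FROM analytics.events
    WHERE DATE(event_ts) BETWEEN '2024-01-01' AND '2024-01-31'
      AND event_type = 'purchase';

Clustering column order matters: filters on the first clustering column benefit the most, so list the most frequently filtered column first.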

 

Requirements:

We know that sometimes, you can’t tick every box. We would still love to hear from you if you think you’re a good fit!

  • 5+ years of experience coding in SQL and Python with solid CS fundamentals including data structure and algorithm design
  • 2+ years of experience with BigQuery and solid understanding of how it works under the hood
  • 3+ years of experience, as a team lead, contributing to production deployments of large backend data processing and analysis systems
  • 3+ years of experience in cloud data platforms (GCP)
  • Knowledge of professional software engineering best practices for the full software development life cycle
  • Knowledge of Data Warehouse design, implementation, and optimization
  • Knowledge of Data Quality testing, automation and results visualization
  • Knowledge of BI reports and dashboards design and implementation (Tableau, Looker)
  • Knowledge of development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience participating in an Agile software development team (e.g. Scrum)

 

What's in it for you?

  • Strong community: Work alongside top professionals in a friendly, open-door environment
  • Growth focus: Take on large-scale projects with a global impact and expand your expertise
  • Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications
  • Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies
  • Flexibility: Enjoy radical flexibility – work remotely or from an office, your choice

Required languages

English B2 - Upper Intermediate