Location: Ukraine
Experience: 8+ years
Skills: GCP, ETL, SQL, Java

We are looking for an experienced Data Engineer able to lead a small team contributing to a project to modernize the client's data warehousing.

The requirements for this position are the following:

Good understanding of data ingestion frameworks, data processing frameworks, pipeline orchestration technologies, data storage engines, data access/warehousing tools
Experience with cloud services used for distributed data processing, scalable data pipelines, ETL, and data warehousing. Preference will be given to experience with GCP services such as Cloud Storage, Cloud Data Fusion, Cloud Functions, and BigQuery
Deep familiarity with SQL and various DBMSs; MSSQL is preferred
Good scripting skills in a modern programming language (e.g., Python, Java, Scala, C#)
Ability to lead a small team responsible for ETL and data warehousing
Ability to work with the client's high-level requirements: interpret, analyze, and implement them, and demonstrate the resulting solution
Experience working with Agile methodology and the ability to act as a Scrum Master
Good spoken and written English

Will be a plus:

Experience with data streaming via Apache Kafka or similar services (AWS Kinesis, GCP Pub/Sub)
Experience with data visualization tools like Tableau
Experience with Snowflake, or other distributed data warehousing solutions

Job Responsibilities

• Design and development of data management related solutions
• Create and maintain optimal data pipeline architecture; build data pipelines to pull data from different source systems; integrate, consolidate, transform, and cleanse data according to business rules, and structure it for use by analytics applications
• Review business requirements; map reports, data subjects, and entities to source systems and artifacts in the Enterprise Data Warehouse (EDW) and/or Data Lake (DL)
• Document requirements and business rules in appropriate technical specifications
• Plan, test and implement enhancements to the EDW and DL environments on a regular release cycle
• Participate in work planning and estimation
• Present technical solutions to various stakeholders
• Develop and maintain sufficient support documentation to facilitate day-to-day support of the EDW and DL environments

Department/Project Description
The client is an industrial asset disposition and management company that sells heavy industrial equipment and trucks through auctions and other transactional channels. The program, consisting of 20+ engineers, will include Architecture, Platform, and Data Management teams working from offshore and nearshore locations, augmenting the existing onsite engineering team.

Keyskills - Must Have
Data Warehousing, AWS, Big Data, SQL, Java, Apache Spark, Solution Architecture, Database, Kafka, AWS S3, AWS Lambda, AWS Athena
Keyskills - Nice to Have
Confluent Kafka, Snowflake, T-SQL, IntelliJ IDEA, Jenkins Pipelines, PostgreSQL

About GlobalLogic

GlobalLogic is a full-lifecycle product development services leader that combines chip-to-cloud software engineering expertise and vertical industry experience to help our customers design, build, and deliver their next-generation products and digital experiences. By leveraging Agile / Lean MVP methods, cutting-edge technologies, and an integrated approach to experience design and complex engineering, we empower global brands such as Microsoft, BMC, Coca Cola, Samsung, Physio Control, and Roku to develop the “next big thing” in their markets. GlobalLogic is headquartered in Silicon Valley and operates design and engineering centers around the world, where we are continuously recognized as a top innovator and employer by organizations like Zinnov and Glassdoor.

Job posted on 23 February 2021