Data Analyst (offline)

Description:
About client:
Our client is a fast-paced digital healthcare company that creates a growing portfolio of online health communities for people with chronic conditions. The company's mission is to improve patients' quality of life by connecting them with each other, with caregivers, and with healthcare industry partners, building beneficial social interactions and meaningful health conversations.

About project:
The core of the project is working with data; the goal is to re-design, transition to the cloud, and enhance data-driven solutions. The team is responsible for expanding and optimizing the data flow and data pipeline architecture, modeling and analyzing data, interpreting trends and patterns in complex data sets, and translating them into product and marketing insights.

About position:

The Data Quality Engineer is responsible for designing, developing, documenting, and performing data quality checks across all data assets, including ETL jobs, reports, dashboards, data pipelines, and data applications. The primary goal of this role is to ensure the high quality of data delivered to internal stakeholders and customers. Key responsibilities include validating data in data repositories against data from source systems, validating metrics and data in reports and dashboards against the repositories, and keeping data assets consistently accurate for users.

Requirements:

● Working knowledge of SQL

● Understanding of data warehouses and business intelligence tools

● 2+ years of experience as a data analyst

● Experience writing analytical Python code

● QA knowledge



Nice to have:

● Experience with Looker

● Experience with dbt

Preferences:
Git

Responsibilities:

Design, develop, and maintain a data quality assurance framework
Work in conjunction with BI, Data, and Analytics Engineers to ensure high-quality data deliverables
Design and develop testing frameworks to test ETL jobs, BI reports, dashboards, and other data pipelines
Write SQL scripts to validate data in the data warehouse against the data in the source system(s)
Write SQL scripts to validate data surfacing in BI assets against the data sources
Track, monitor, and document testing results
Develop and maintain Extract, Transform, and Load (ETL) processes, administer database performance, and design dimensional table structures
Work closely with data teams to understand how the data warehousing functionality operates
Write high-quality, well-structured code that is maintainable and extensible
Analyze complex data systems to develop automated and reusable solutions for extracting requested information while assuring data validity and integrity
Perform tasks spanning the full lifecycle of data management activities with minimal supervision
Manage code using Git and other version control approaches as applicable
Maintain data standards, enforce standard development protocols, and analyze requirements to ensure technical and standard operating procedure impacts are considered
Develop and coordinate test plans
Perform ongoing monitoring and refinement of the data platform
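To illustrate the validation work described above, here is a minimal sketch of a reconciliation check between a source system and a warehouse table. All table and column names are hypothetical, and sqlite3 stands in for both database connections purely for the sake of a self-contained example; a real check would run the same queries against the actual source and warehouse.

```python
import sqlite3

def row_count(conn, table):
    """Return the number of rows in `table`."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def column_sum(conn, table, column):
    """Return the sum of a numeric column -- a cheap reconciliation metric."""
    return conn.execute(
        f"SELECT COALESCE(SUM({column}), 0) FROM {table}"
    ).fetchone()[0]

def validate(source_conn, warehouse_conn, table, numeric_column):
    """Compare source vs. warehouse on each metric; return per-check results."""
    checks = {
        "row_count": (row_count(source_conn, table),
                      row_count(warehouse_conn, table)),
        "column_sum": (column_sum(source_conn, table, numeric_column),
                       column_sum(warehouse_conn, table, numeric_column)),
    }
    return {name: {"source": s, "warehouse": w, "match": s == w}
            for name, (s, w) in checks.items()}

if __name__ == "__main__":
    # In-memory stand-ins for the source system and the warehouse.
    source = sqlite3.connect(":memory:")
    warehouse = sqlite3.connect(":memory:")
    for conn in (source, warehouse):
        conn.execute("CREATE TABLE events (id INTEGER, amount REAL)")
    source.executemany("INSERT INTO events VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
    warehouse.executemany("INSERT INTO events VALUES (?, ?)", [(1, 10.0)])  # one row missing

    # The mismatched row count and sum would surface here as match=False.
    for name, result in validate(source, warehouse, "events", "amount").items():
        print(name, result)
```

In practice such checks are tracked and documented per run (as the responsibilities above require), with mismatches raised to the data teams rather than merely printed.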

About GlobalLogic

GlobalLogic, a Hitachi Group Company, is a leader in digital engineering. We put people first. As part of our team, you will grow, be challenged, and expand your skill set working alongside highly experienced and talented people.

In Ukraine, GlobalLogic is:
- one of the TOP-3 largest IT companies
- 6,000+ professionals
- 90%+ of our projects involve complex R&D
- fully autonomous offices are located in Kyiv, Kharkiv, Lviv, and Mykolaiv, along with 10 temporary mini-offices across Ukraine

What is GlobalLogic in numbers:
- 29,000+ engineers
- 20+ countries
- 500+ active clients
- 50+ product engineering centers
- Headquartered in Silicon Valley

Company website:
https://bit.ly/GlobalLogic-Ukraine

DOU company page:
https://jobs.dou.ua/companies/globallogic/

The job ad is no longer active
Job unpublished on 11 February 2022
