Cloud Data Engineer
PwC is a global network of more than 370,000 professionals in 149 countries that turns challenges into opportunities. We create innovative solutions in audit, consulting, tax and technology, combining knowledge from all over the world.
PwC SDC Lviv, opened in 2018, is part of this global space. It is a place where technology is combined with team spirit, and ambitious ideas find their embodiment in real projects for Central and Eastern Europe.
What do we guarantee?
- Work format: Remote or in a comfortable office in Lviv - you choose.
- Development: Personal development plan, mentoring, English and Polish language courses.
- Stability: Official employment from day one, annual review of salary and career prospects.
- Corporate culture: Events that unite the team and a space where everyone can be themselves.
As a global leader in IT services, PwC draws on PwC Poland’s Data, Digital Solutions & AI team. We provide first-class services across Data Platforms, Data Analytics, Data Strategy and Data Management, to name just a few of our core capabilities. We support clients in the transition to and adoption of modern data infrastructure, the creation of new business models, and the streamlining of their operational activities.
Your future role:
- Building data platforms with critical components including data lakehouses, data warehouses and data lakes using tools like MS Fabric, Azure Databricks and Azure Data Factory for extraction, loading and transformation;
- Defining the flow of data through the data platform, from raw ingestion to business-ready consumption, using the most suitable data architecture patterns (e.g. medallion, lambda);
- Designing the migration plans for moving transactional and master data from legacy systems of record to systems of reference (new solutions);
- Conducting comprehensive data assessments alongside Data Modelers and Analysts across multiple systems of record and data domains to identify and remediate data quality issues;
- Liaising with senior client stakeholders and supporting Senior Data Architects to gather functional requirements for designing modern data architecture;
- Supporting Data Governance subject matter experts to define data standards and policies, specifically around Master Data Management and Data Quality;
- Familiarity with Data Science and Data Analytics use cases in the Consumer Goods, Retail, Manufacturing and/or Financial Services sectors will help you stand out.
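To give a flavour of the medallion flow described above (raw ingestion through to business-ready consumption), here is a minimal illustrative sketch. It is not PwC's actual stack: it uses SQLite in place of a lakehouse engine such as Microsoft Fabric or Databricks, and all table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze layer: raw ingested records, duplicates and bad rows included.
cur.execute("CREATE TABLE bronze_orders (order_id TEXT, amount TEXT, country TEXT)")
cur.executemany(
    "INSERT INTO bronze_orders VALUES (?, ?, ?)",
    [
        ("A1", "100.50", "PL"),
        ("A1", "100.50", "PL"),        # duplicate row
        ("A2", "not-a-number", "UA"),  # invalid amount
        ("A3", "75.00", "UA"),
    ],
)

# Silver layer: deduplicated, typed, invalid rows filtered out.
cur.execute("""
    CREATE TABLE silver_orders AS
    SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount, country
    FROM bronze_orders
    WHERE amount GLOB '[0-9]*.[0-9]*'
""")

# Gold layer: business-ready aggregate for consumption.
cur.execute("""
    CREATE TABLE gold_revenue_by_country AS
    SELECT country, SUM(amount) AS revenue
    FROM silver_orders
    GROUP BY country
""")

for row in cur.execute(
    "SELECT country, revenue FROM gold_revenue_by_country ORDER BY country"
):
    print(row)
```

In a real engagement the same bronze/silver/gold layering would typically be expressed as PySpark or SQL transformations orchestrated by Azure Data Factory, with each layer persisted as Delta tables rather than in-memory SQLite tables.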
Apply if you have:
- Minimum of 3 years in data engineering roles, although this can be considered alongside other relevant experience e.g., Business Intelligence, Data Science etc.;
- Hands‑on experience with data transformation platforms, with a strong preference for Azure‑based solutions such as Microsoft Fabric and Databricks; experience with other platforms (e.g. Snowflake, Azure Synapse, Google BigQuery) is also valued;
- Experience in building data warehouses, data lakes and data lakehouses using tools like Azure Data Factory and Azure Synapse Analytics;
- Familiarity with writing data transformation scripts in Python (e.g., PySpark) and SQL to query databases;
- Some experience with Data Engineering CI/CD practices and release management across Dev, QA and Production environments using tools like Azure DevOps and/or Jira;
- Proficiency in Business English at B2 level or above, with the confidence to write technical documentation, e.g., design documents, with support from a Project Manager;
- Experience with new technologies and AI-based tools demonstrated in your daily work (e.g., task automation, information analysis, content creation).
Nice to have:
- Cloud certifications in Azure, AWS, or Google Cloud;
- Data Governance certifications e.g., DAMA;
- Evidence of training using online platforms e.g. Coursera, Udemy, Datacamp etc.;
- Application of data modelling frameworks including Kimball dimensional modelling to design and build logical and physical data models;
- Interest in and some experience experimenting with Agentic AI to automate the extraction, transformation and loading of data, using services like Azure AI Foundry;
- Proven experience conducting data assessments across multiple data domains (including Product and Customer) and systems of record, e.g., ERPs and CRMs;
- Experience of working on Data Strategy and Governance projects with a focus on improving Data Quality across key business functions;
- Familiarity with agile project management methodologies like Scrum to enable and scale minimum viable products;
- Some exposure to Data Science and Data Analytics use cases, particularly in applying Machine Learning and Generative AI techniques to help improve business forecasting and demand planning;
- Experience in supporting senior stakeholders including Chief Data Officers in medium to large sized businesses across sectors such as Manufacturing, Consumer Goods, and/or Financial Services.
Policy statements:
https://www.pwc.com/ua/uk/about/privacy.html
Required skills and experience
| Skill | Experience |
| Microsoft Fabric | 3 years |
| Databricks | 3 years |
| Azure Data Factory | 3 years |
| Azure Synapse | 3 years |
| PySpark | 3 years |
| SQL | 3 years |
| CI/CD | 3 years |
Required languages
| Language | Level |
| English | B2 - Upper Intermediate |
| Ukrainian | B2 - Upper Intermediate |