Principal Data Engineer
The Role
The Principal Data Engineer will support the Product Data Domain teams. You will build ETL pipelines to ingest and transform data, developing the data products that power key use cases across the company. You will work in an agile, multi-disciplinary team alongside product analytics developers, product data managers, data modellers and data operations managers, ensuring that all work delivers maximum value to the company.
Role and Responsibilities
- Leads and architects the development of robust, scalable and complex data pipelines to ingest, transform, and analyse large volumes of structured and unstructured data from diverse sources. Pipelines must be optimised for performance, reliability, and scalability in line with the client's scale.
- Lead initiatives to enhance data quality, governance and security across the organisation, ensuring compliance with client guidelines and industry best practices.
- Prioritises stakeholder requirements and identifies the best solution for timely delivery.
- Leads on building automation workflows including monitoring and alerting.
- Encourages and mentors team members, in partnership with other disciplines, to create value with data across the wider organisation.
- Helps set standards for coding, testing and other engineering practices.
- Leads on the building and testing of business continuity & disaster recovery procedures per requirements.
- Proactively evaluates and provides feedback on future technologies and new releases/upgrades, based on a deep understanding of the domain.
Are you the right candidate?
When it comes to data engineering, we look for the following skills.
Technical skills
- Extensive (5+ years) experience in a data engineering or analytics engineering role, preferably in digital products.
- Extensive experience in building ETL pipelines and ingesting data from a diverse set of sources (including event streams and various forms of batch processing).
- Excellent SQL and Python skills.
- Extensive hands-on experience with AWS.
- Good working knowledge of Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake).
- Experience in deploying and scheduling code bases in a data development environment, using technologies such as Airflow.
- Demonstrable experience of working alongside cross-functional teams interacting with Product Managers, Infrastructure Engineers, Data Scientists, and Data Analysts.
Required languages
| Language | Level |
| --- | --- |
| English | B2 - Upper Intermediate |
SQL, Python, Data Engineer, AWS, Airflow, ETL/ELT, Redshift, BigQuery, GCP
Published 24 February
Salary: $3700-5500