Python Databricks Engineer - Wroclaw

  • Project Description:

    We are looking for a Senior Data Engineer - Reporting & Process Optimization to support our regulatory reporting transformation initiative within an investment banking environment. The primary focus of this role is to maintain the current reporting system built in Python, while also leading the migration to Azure Databricks to improve performance, scalability, and efficiency.

    This role involves developing, optimizing, and automating reporting processes using PySpark and SQL, ensuring accurate and timely regulatory report generation. The successful candidate will work closely with risk teams, regulatory compliance teams, and data engineers to enhance reporting workflows and integrate the new system into our Azure cloud infrastructure.
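As a minimal, self-contained sketch of the kind of SQL-driven report aggregation this role describes, the example below uses Python's built-in `sqlite3` in place of Databricks/PySpark (which the real system would use); the `trades` table, its columns, and the sample rows are invented purely for illustration.

```python
import sqlite3

# Hypothetical in-memory table standing in for a reporting data source;
# schema and values are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (trade_id TEXT, counterparty TEXT, notional REAL)"
)
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [
        ("T1", "ACME", 1_000_000.0),
        ("T2", "ACME", 250_000.0),
        ("T3", "GLOBEX", 500_000.0),
    ],
)

# Aggregate total notional per counterparty -- the shape of a simple
# regulatory exposure summary.
report = conn.execute(
    "SELECT counterparty, SUM(notional) AS total_notional "
    "FROM trades GROUP BY counterparty ORDER BY counterparty"
).fetchall()
print(report)  # [('ACME', 1250000.0), ('GLOBEX', 500000.0)]
```

In a Databricks migration, the same `GROUP BY` logic would typically be expressed as a PySpark DataFrame aggregation or Spark SQL query over the actual reporting tables.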

    This is an exciting opportunity to work on critical financial reporting processes, contributing to regulatory compliance, data governance, and process efficiency improvements.

  • Responsibilities:

    - drive the modernization and scalability of the client, transaction, and regulatory reporting platforms within Asset Management technology
    - implement high-quality solutions that address business needs and adhere to industry standards
    - take the automation of manual processes to the next level
    - collaborate closely within an agile team and a global agile organization
    - translate high-level business requirements into technical requirements addressing reliability, scalability, and performance needs

  • Mandatory Skills Description:

    - 4+ years of hands-on application development experience using Databricks with Python
    - experience with PySpark, Pandas, NumPy, and unit testing
    - hands-on experience with SQL Server/PostgreSQL databases
    - practical experience with microservice architectures and CI/CD pipelines (preferably with GitLab)
    - good to have: exposure to cloud-native services on Azure
    - experience in translating high-level business requirements into IT requirements
    - excellent communication skills

  • Nice-to-Have Skills Description:

    - bachelor's/master's degree or equivalent with a focus in software engineering
    - Azure Data Services
    - ETL & data processing
    - financial data processing: experience handling risk, compliance, and regulatory data in an investment banking environment
    - data modeling, metadata management
    - Git, GitHub, GitLab
    - Jenkins
    - regulatory compliance knowledge: Basel, MiFID, GDPR
    - Big Data
    - cloud security and access controls (IAM, RBAC)
    - familiarity with Docker, Kubernetes, Apache

  • Languages:
    • English: C1 Advanced

Published 13 April