Senior Data Engineer
Job Description
Required Qualifications
- 4+ years of experience in Data Engineering, with hands-on ETL development.
- Proven experience with Apache Airflow (DAG design, scheduling, and monitoring) and Apache NiFi.
- Strong experience with Snowflake architecture, data migration, and performance optimization.
- Proficient in SQL, Python, and working with REST APIs for data ingestion.
- Experience with cloud environments (Azure or AWS).
- Excellent written and verbal English communication skills (the candidate will report to the USA-based PMO).
Preferred Qualifications
- Experience in data infrastructure modernization projects.
- Exposure to CI/CD practices and DevOps collaboration.
- Familiarity with integrating tools such as Azure DevOps, Snyk, OpsGenie, or Datadog.
- Prior working experience with PAM or Identity Management solutions.
Job Responsibilities
As a Senior Data Engineer, you will be expected to:
- Design scalable data pipeline architectures to support real-time and batch processing.
- Deploy and configure Apache Airflow for the orchestration of complex ETL workflows.
- Develop Airflow DAGs for key integrations (an illustrative sketch follows this list):
  - Azure DevOps (Work Items, build/release pipelines, commit data)
  - OpsGenie (incident and alert data)
  - Snyk (security vulnerability data)
  - Datadog (infrastructure monitoring and logs)
- Migrate the existing data warehouse infrastructure and historical data to Snowflake.
- Create documentation for data architecture, Airflow configurations, and DAGs.
- Collaborate with Engineering and DevOps teams to align on integration and deployment strategies.
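To give a concrete sense of the DAG work described above, here is a minimal, illustrative sketch of one such integration: pulling recently changed Azure DevOps work items over the REST API and landing the raw JSON in Snowflake. It assumes Airflow 2.4+ with the Snowflake provider (apache-airflow-providers-snowflake) and the requests library installed; the organization, project, landing table, connection ID, and the azdo_pat Variable are placeholders, not the client's actual setup.

```python
import json

import pendulum
import requests

from airflow.decorators import dag, task
from airflow.models import Variable
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

AZDO_ORG = "example-org"              # hypothetical Azure DevOps organization
AZDO_PROJECT = "example-project"      # hypothetical project
TARGET_TABLE = "RAW.AZDO_WORK_ITEMS"  # hypothetical Snowflake landing table


@dag(
    schedule="@hourly",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["azure-devops", "snowflake"],
)
def azdo_work_items_to_snowflake():
    """Extract recently changed work items and land the raw JSON in Snowflake."""

    @task
    def extract_work_item_ids() -> list[int]:
        # WIQL query for work items changed in the last day; the PAT is read
        # from an Airflow Variable (the Variable name is a placeholder).
        pat = Variable.get("azdo_pat")
        resp = requests.post(
            f"https://dev.azure.com/{AZDO_ORG}/{AZDO_PROJECT}/_apis/wit/wiql",
            params={"api-version": "7.0"},
            auth=("", pat),  # Azure DevOps PATs use basic auth with a blank user
            json={"query": "SELECT [System.Id] FROM WorkItems "
                           "WHERE [System.ChangedDate] >= @Today - 1"},
            timeout=30,
        )
        resp.raise_for_status()
        return [item["id"] for item in resp.json()["workItems"]]

    @task
    def load_to_snowflake(ids: list[int]) -> None:
        if not ids:
            return
        pat = Variable.get("azdo_pat")
        resp = requests.get(
            f"https://dev.azure.com/{AZDO_ORG}/_apis/wit/workitems",
            params={"ids": ",".join(map(str, ids[:200])),  # batch API caps at 200 IDs
                    "api-version": "7.0"},
            auth=("", pat),
            timeout=30,
        )
        resp.raise_for_status()
        hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
        # Land each work item as raw JSON in a VARIANT column;
        # downstream models would flatten and type it.
        for item in resp.json()["value"]:
            hook.run(
                f"INSERT INTO {TARGET_TABLE} (PAYLOAD) SELECT PARSE_JSON(%s)",
                parameters=(json.dumps(item),),
            )

    load_to_snowflake(extract_work_item_ids())


azdo_work_items_to_snowflake()
```

The real DAGs would follow whatever connection management, batching, and incremental-load conventions the client already has in place; this sketch only shows the general extract-and-land shape of the work.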
Department/Project Description
The client is an international product company whose goal is to redefine the legacy approach to Privileged Access Management by delivering multi-cloud-architected solutions that enable digital transformation at scale. The client establishes a root of trust and then grants least-privileged access just in time, based on verifying who is requesting access, the context of the request, and the risk of the access environment.
The client's products centralize and orchestrate fragmented identities, improve audit and compliance visibility, and reduce risk, complexity, and costs for the modern, hybrid enterprise. Over half of the Fortune 100, the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, all trust this company to stop the leading cause of breaches – privileged credential abuse.
The client seeks an experienced Data Engineer to support the Engineering Team in building a modern, scalable data pipeline infrastructure. This role will focus on migrating existing ETL processes to Apache Airflow and transitioning the data warehouse to Snowflake. The engineer will be responsible for designing architecture, developing DAGs, and integrating data sources critical to operations, development, and security analytics.
The role reports to the client's USA-based PMO and covers data analysis, reporting, database querying, and presenting results to stakeholders. The selected candidate will contribute to the client's PAM solution as part of the reporting team, which is responsible for analyzing, structuring, and presenting company data to Senior and VP-level stakeholders.