dclogicgroup

ETL Architect (Informatica / Talend / SSIS)

We are seeking an experienced ETL Architect with strong expertise in data integration, ETL pipelines, and data warehousing. The ideal candidate will have hands-on experience with tools such as Informatica PowerCenter/Cloud, Talend, and Microsoft SSIS, and will be responsible for architecting scalable, secure, and high-performing ETL solutions. This role involves collaborating with business stakeholders, data engineers, and BI teams to deliver clean, consistent, and reliable data for analytics, reporting, and enterprise systems.


Key Responsibilities

  • Design, architect, and implement ETL pipelines to extract, transform, and load data across multiple sources and targets.
  • Define ETL architecture standards, frameworks, and best practices for performance and scalability.
  • Lead the development of data integration solutions using Informatica, Talend, SSIS, or equivalent ETL tools.
  • Collaborate with business analysts, data engineers, and BI developers to translate business requirements into data models and ETL workflows.
  • Ensure data quality, security, and compliance across all ETL processes.
  • Troubleshoot and optimize ETL jobs for performance, scalability, and reliability.
  • Support data warehouse / data lake design and integration.
  • Manage ETL environments, upgrades, and migration to cloud platforms (AWS, Azure, GCP).
  • Provide mentoring, code reviews, and technical leadership to junior ETL developers.


Requirements

  • 7+ years of experience in ETL development, including at least 3 years in a lead or architect role.
  • Strong expertise in one or more major ETL tools: Informatica (PowerCenter/Cloud), Talend, SSIS.
  • Experience with relational databases (Oracle, SQL Server, PostgreSQL) and data warehousing concepts (Kimball, Inmon).
  • Strong knowledge of SQL, PL/SQL, stored procedures, and performance tuning.
  • Familiarity with cloud data integration (AWS Glue, Azure Data Factory, GCP Dataflow/Dataproc).
  • Experience handling large-scale data migrations and both batch and real-time ETL processing.
  • Strong problem-solving, analytical, and architectural design skills.
  • Excellent communication skills, with the ability to engage technical and non-technical stakeholders.


Nice to Have

  • Hands-on experience with big data platforms (Hadoop, Spark, Kafka, Databricks).
  • Knowledge of data governance, MDM, and metadata management.
  • Familiarity with API-based integrations and microservices architectures.
  • Prior experience in industries such as banking, insurance, healthcare, or telecom.
  • Certification in Informatica, Talend, or cloud ETL platforms.

Required languages

English C1 - Advanced
Published 26 September