Data Architect

CHI Software Top Employer

The CHI Software team never stands still. We love our work and give it one hundred percent! Every new project is a challenge that we meet successfully. The only thing that can stop us is… Wait, it’s nothing! The number of projects keeps growing, and our team is growing with them. And now we need a Data Architect.

Must-Have Skills

  Data Architecture (8+ years, 3+ as architect)
  • Enterprise data architecture – conceptual, logical, and physical modeling
  • Lakehouse architecture patterns – medallion architecture, data mesh principles
  • Hybrid data architecture – on-prem ↔ cloud data movement, consistency, governance
  • Data modeling – dimensional modeling, Data Vault 2.0, or similar for analytical workloads

  Cloudera / Hadoop Ecosystem (Critical)
  • Cloudera Data Platform (CDP) – architecture, administration, security
  • Apache Hive / Iceberg – table formats, partitioning strategies, compaction
  • Apache Spark on Cloudera – performance tuning, resource management
  • HDFS / Ozone – storage architecture, replication, tiering
  • Apache Ranger / Knox – security, access control, audit

  Query & Analytics Layer
  • Starburst / Trino – federated query architecture, connector management, caching strategies
  • Tableau – data source architecture, extract vs. live connections, semantic layer design
  • SQL optimization for analytical workloads – join strategies, materialized views, caching

  Streaming & Integration
  • Apache Kafka – topic design, schema registry (Avro/Protobuf), exactly-once semantics
  • CDC (Change Data Capture) patterns – Debezium, Kafka Connect
  • ETL/ELT pipeline design – batch and streaming patterns
  • Apache Airflow – DAG design for complex data pipelines

  Governance & Quality
  • Informatica – data catalog, data quality, metadata management
  • Data governance frameworks – data ownership, stewardship, lineage
  • ABAC (attribute-based access control) implementation for data access
  • Data quality metrics and monitoring
  • Business glossary management

  Cloud Data Architecture (AWS)
  • S3 – data lake storage patterns, lifecycle policies, storage classes
  • AWS Glue – catalog, ETL jobs
  • Lake Formation – fine-grained access control
  • Hybrid integration – AWS Direct Connect, data replication strategies

Nice-to-Have

  • Telco data models (CDR, network events, subscriber data, billing)
  • Experience migrating from traditional Hadoop to modern lakehouse
  • Data Mesh implementation experience
  • Cost optimization for large-scale data platforms
  • DAMA-DMBOK or equivalent data management certifications

Engagement Model

  • Full-time staff augmentation (embedded in team)
  • Business trips to Baku every 2 months
  • Must overlap with Baku working hours (GMT+4)
  • English fluency required

Required languages

English B2 - Upper Intermediate
Published 13 March