Data Engineer
We are looking for an experienced Data Engineer to design and implement scalable data pipelines, own core storage layers, and help build the foundations of a modern DataOps lifecycle.
Job requirements
- 3+ years of professional experience in Data Engineering / DWH development.
- Strong SQL expertise and solid Python skills (pandas, pyarrow).
- Production experience with Apache Airflow (production DAGs, scheduling, operators).
- Hands-on work with PostgreSQL (schema design, migrations, partitioning) and ClickHouse (MergeTree engines, materialized views, performance tuning).
- Experience with data warehouse design: layered architecture, data modelling, denormalisation.
- Experience integrating REST APIs and third-party data sources.
- Experience with Docker / Docker Compose: containerised services and multi-service environments.
- Fluency in English.
Nice to Have:
- Experience with Apache Superset (RLS, dashboard layer).
- Familiarity with AI/LLM tooling for data (NL-to-SQL, vector databases).
- Experience in the Fintech field: FX, brokerage, CRM/BO systems.
- Experience with multi-tenant data architectures.
Job responsibilities
- Design, build, and maintain ETL/ELT pipelines from multiple product data sources.
- Model and optimise data structures in ClickHouse (views, engines, schema evolution).
- Orchestrate workflows using Apache Airflow.
- Collaborate with BI teams (Superset) and backend teams (FastAPI/Python) to build reliable data products.
- Contribute to architectural decisions.
- Collaborate closely with product managers, BI developers, and DevOps to deliver data products end-to-end.
Job benefits
- Competitive and attractive pay.
- Flexible hours for a better work-life balance.
- Work your way: hybrid flexibility to fit your life.
- 21 vacation days + 7 no-questions-asked sick days per year.
- Career growth: continuous development and performance reviews.
Required languages
| English | B1 - Intermediate |