Our customer's company is the winner of the Best Fintech Startup and Best Financial Product awards in Germany.

The product they are creating is an innovative electronic trading marketplace for Asset-Based Financing solutions that connects Buyers, Suppliers, Banks and Institutional Investors. The platform will dramatically change Supply Chain Financing (SCF) processes, opening up completely new financing possibilities in the B2B market. Within 6 years, our client has managed to create a product that meets the high standards of the international financial industry, and to acquire and integrate several international corporate clients such as Lufthansa, Nestle, Vattenfall and Daimler.

One of the project cornerstones is the effective use of data. Use cases span a wide spectrum, including business intelligence, process automation, recommendation engines and risk scoring. Help us build the data infrastructure today that we need to deliver the data products of tomorrow. You will work closely with different domains to figure out ways to acquire new customers, scale existing programs and optimize user experience, all through data. As the data expert, you will build strong relationships with internal and external stakeholders, and you will be required to deeply understand our current and future challenges. As part of a fast-paced tech start-up, you are expected to gain hands-on experience with advanced open-source data technologies and to actively drive the data culture.

Your role:
Collaborate with other teams in the business domain to design data models and data processing logic that translate operational data into valuable business information
Build and maintain a scalable, low-latency data warehouse using batch ETL and stream-processing technologies (a minimal batch-ETL sketch follows this list)
Collaborate in the development of applications for reporting, business intelligence and data analytics products
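
As a rough illustration of the batch-ETL side of the role, here is a minimal sketch of a pandas job that loads operational records into a warehouse table, in the PostgreSQL/pandas/SQLAlchemy stack named below. The file name, table name, column names and connection string are hypothetical, not part of the client's actual pipeline.

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical connection string; any PostgreSQL instance would do.
    engine = create_engine("postgresql://user:password@localhost:5432/warehouse")

    # Extract: read one day's operational data (file name is made up).
    orders = pd.read_csv("orders_2020-11-10.csv", parse_dates=["created_at"])

    # Transform: derive the business information the warehouse serves.
    daily = (
        orders
        .assign(order_date=orders["created_at"].dt.date)
        .groupby(["order_date", "supplier_id"], as_index=False)
        .agg(order_count=("order_id", "count"), total_volume=("amount", "sum"))
    )

    # Load: append into a (hypothetical) fact table for reporting queries.
    daily.to_sql("fact_daily_orders", engine, if_exists="append", index=False)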

Requirements:
Strong SQL experience (ad hoc queries, optimization techniques, preferably PostgreSQL)
Python:
    ETL/microservices
    data processing & visualization (pandas; matplotlib / hvPlot / plotly; Jupyter / JupyterHub)
    basic web development (Django / Flask / Falcon; SQLAlchemy)
ETL experience / data warehouse concepts
Schema architecture (e.g. flat-file / star / snowflake)
Processing paradigms (batch / mini-batch / streaming / change data capture / lambda / kappa)
Orchestration: Airflow / Prefect / Luigi / Jenkins
Storage infrastructure for scalable OLAP processing: PostgreSQL / Amazon Redshift
Experience with any batch processing technology: SQL / Talend / Elastic / Spark / custom-built
Experience with any stream processing technology: Kafka / Spark Streaming / Storm / Flink
Basic experience with a business intelligence visualization tool (e.g. Power BI, Tableau)
Good conceptual knowledge of stream processing (eventual consistency, duplicate handling, data latency, watermarking, stateless streams); see the sketch after this list
Good knowledge of Docker
English upper-intermediate or advanced
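
To make the stream-processing concepts above concrete, here is a minimal sketch in Spark Structured Streaming that reads events from Kafka, deduplicates them to handle at-least-once delivery, and applies a watermark to bound state for late data. The topic name, field names and thresholds are hypothetical; this illustrates the concepts, not the team's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json, window
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("watermark-demo").getOrCreate()

    # Hypothetical event schema carried in the Kafka message value.
    schema = StructType([
        StructField("event_id", StringType()),
        StructField("supplier_id", StringType()),
        StructField("event_time", TimestampType()),
    ])

    # Read from a hypothetical Kafka topic.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "invoice-events")
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Watermark: accept events up to 10 minutes late, then discard state for
    # older windows. Deduplication by event_id handles redelivered messages.
    counts = (
        events
        .withWatermark("event_time", "10 minutes")
        .dropDuplicates(["event_id", "event_time"])
        .groupBy(window(col("event_time"), "5 minutes"), col("supplier_id"))
        .count()
    )

    # Append mode emits each window only once its watermark has passed,
    # trading a little latency for a single consistent result per window.
    query = counts.writeStream.outputMode("append").format("console").start()
    query.awaitTermination()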

Nice to have:
AWS infrastructure
Kubernetes
CI/CD knowledge
Basic Java

Higher Education: Bachelor’s Degree

About Intellias

Intellias is a challenge-driven software engineering company based in Ukraine (Kyiv, Lviv, Odesa, Kharkiv) and locally represented in Berlin, Germany.
Since 2002, we've been helping leading technology companies from the EU and North America create their software products by building and operating world-class engineering teams in Eastern Europe for them.

Company website:
intellias.com

DOU company page:
https://jobs.dou.ua/companies/intellias/

Job posted on 10 November 2020