We are a highly successful company with great ambitions. We operate in a very competitive market, so every day we look for opportunities to be better and faster. We never stand aside and are never afraid to try. We have plenty of ideas of our own and are always open to fresh ones.

We invite those who are fired up about:
— Working with large datasets (100+ TB) that must be updated at least hourly;
— DWH components: AWS (S3, Athena, Redshift), GCP (Cloud Storage, BigQuery), PostgreSQL;
— Data collection: Kafka, Google Analytics, Firebase, Appsflyer, Cloudflare, other 3rd party apps;
— Data modelling: building a centralised data catalog with well-validated and documented data marts;
— Data quality / integrity testing automation;
— Designing and implementing REST-based APIs;
— Development and support of ETL / ELT processes;
— Creation and support of project documentation.

Essential professional experience:
— 2+ years of experience as a Python / Data Engineer;
— Hands-on experience with the following technologies:
— Designing and implementing RESTful APIs (aiohttp, Flask, FastAPI);
— Relational databases (PostgreSQL, Microsoft SQL Server);
— Job scheduling, task queues;
— Cloud providers: Google Cloud Platform (Cloud Storage, BigQuery), AWS (S3, Athena, Redshift), etc.;
— Linux, Docker;
— Incorporating and utilising BDD / TDD / unit testing;
— Exceptional problem solving, technical and data analysis skills;
— Extensive knowledge of best practices in software design and design patterns;
— Strong Computer Science fundamentals;
— Knowledge of database theory: database types and their pros and cons;
— Knowledge of performance tuning: ETL jobs, SQL queries, partitioning, indexing;
— Hands-on experience with ETL and data warehousing tasks;
— Solid understanding of git flow best practices.

Will be a plus:
— Familiarity with WEB / Mobile applications data sources;
— Hands-on experience with the following technologies:
— Kubernetes;
— Apache Airflow;
— NoSQL databases (Elasticsearch, Redis, MongoDB);
— Kafka, Kafka Connect, Kafka Streams;
— IaC: Terraform, Ansible;
— Salesforce: CRM and marketing platform;
— Graph databases (Neo4j, AgensGraph);
— Near-real-time data processing;
— Data visualisation tools (Tableau, Power BI, QlikView, Metabase, Apache Superset, Grafana, Kibana, etc.).

We care about you:
— Medical insurance/Sport compensation;
— Sport club participation (football, running, basketball or swimming clubs);
— 100% paid sick leaves;
— 20 working days of paid vacation.

— Competitive salary and constant encouragement for your efforts and contribution;
— Bonuses according to company’s policy;
— Welfare (financial support in critical situations);
— Gifts for significant life events (marriage, childbirth).

Personal and professional growth
— Individual annual training budget with an opportunity to visit paid conferences, training sessions, workshops etc.;
— Free corporate library;
— Opportunity to visit our non-stop internal meetups: open talks, IT Pump, etc. as a participant or a speaker and exchange knowledge;
— A world-class team of T-shaped skilled professionals that share knowledge and support each other.

Leisure time
— Corporate parties and events (Pub Quiz, Carquest, etc.);
— PM Foundation activities (social responsibility events);
— Weekly events aimed at culture, arts, and soft skills development.

About Parimatch Tech

Parimatch Tech is the hi-tech R&D centre of the global holding company Parimatch. We are an innovative provider of future-defining tech solutions in the Gaming & Entertainment industry, committed to giving the global community the highest-quality betting products and gaming experience.

Job posted on 31 August 2021